Oct 01 16:02:15 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 16:02:15 crc restorecon[4735]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 16:02:15 crc restorecon[4735]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 16:02:15 crc restorecon[4735]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc 
restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:15 crc restorecon[4735]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 
16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 
crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 
16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 16:02:16 crc restorecon[4735]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 
crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc 
restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 16:02:16 crc restorecon[4735]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 16:02:16 crc restorecon[4735]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Oct 01 16:02:17 crc kubenswrapper[4764]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 01 16:02:17 crc kubenswrapper[4764]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Oct 01 16:02:17 crc kubenswrapper[4764]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 01 16:02:17 crc kubenswrapper[4764]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 01 16:02:17 crc kubenswrapper[4764]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 01 16:02:17 crc kubenswrapper[4764]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.498070 4764 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501034 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501067 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501072 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501077 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501082 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501087 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501092 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501096 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501100 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501104 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501107 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501111 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501115 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501118 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501122 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501125 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501129 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501133 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501137 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501141 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501145 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501150 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501153 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501849 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501869 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501875 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501879 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501889 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501894 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501899 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501906 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501916 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501924 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501930 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501937 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501941 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501946 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501951 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501957 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501962 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501966 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501970 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501974 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501978 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501982 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501986 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501992 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.501997 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502002 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502006 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502013 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502018 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502023 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502027 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502031 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502036 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502061 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502071 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502078 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502085 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502094 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502099 4764 feature_gate.go:330] unrecognized feature gate: Example Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502105 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502111 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502116 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502121 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502126 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502131 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502135 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502140 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.502145 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502251 4764 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502261 4764 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502269 4764 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.502275 4764 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502281 4764 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502285 4764 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502291 4764 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502297 4764 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502302 4764 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502307 4764 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502311 4764 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502317 4764 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502322 4764 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502327 4764 flags.go:64] FLAG: --cgroup-root="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502332 4764 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502337 4764 flags.go:64] FLAG: --client-ca-file="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502342 4764 flags.go:64] FLAG: --cloud-config="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502347 4764 flags.go:64] FLAG: --cloud-provider="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502352 4764 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502357 4764 
flags.go:64] FLAG: --cluster-domain="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502361 4764 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502366 4764 flags.go:64] FLAG: --config-dir="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502370 4764 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502374 4764 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502380 4764 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502384 4764 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502388 4764 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502392 4764 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502396 4764 flags.go:64] FLAG: --contention-profiling="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502400 4764 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502405 4764 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502409 4764 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502413 4764 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502419 4764 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502423 4764 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502427 4764 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 
01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502431 4764 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502436 4764 flags.go:64] FLAG: --enable-server="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502440 4764 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502446 4764 flags.go:64] FLAG: --event-burst="100" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502454 4764 flags.go:64] FLAG: --event-qps="50" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502459 4764 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502464 4764 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502469 4764 flags.go:64] FLAG: --eviction-hard="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502475 4764 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502479 4764 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502484 4764 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502488 4764 flags.go:64] FLAG: --eviction-soft="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502492 4764 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502496 4764 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502500 4764 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502505 4764 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502509 4764 flags.go:64] FLAG: --fail-cgroupv1="false" 
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502513 4764 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502517 4764 flags.go:64] FLAG: --feature-gates="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502522 4764 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502527 4764 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502531 4764 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502536 4764 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502540 4764 flags.go:64] FLAG: --healthz-port="10248" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502544 4764 flags.go:64] FLAG: --help="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502548 4764 flags.go:64] FLAG: --hostname-override="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502552 4764 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502557 4764 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502561 4764 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502566 4764 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502569 4764 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502573 4764 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502577 4764 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502582 4764 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.502585 4764 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502590 4764 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502594 4764 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502598 4764 flags.go:64] FLAG: --kube-reserved="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502602 4764 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502606 4764 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502612 4764 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502616 4764 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502621 4764 flags.go:64] FLAG: --lock-file="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502625 4764 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502629 4764 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502633 4764 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502639 4764 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502643 4764 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502647 4764 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502651 4764 flags.go:64] FLAG: --logging-format="text" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502655 4764 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" 
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502659 4764 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502663 4764 flags.go:64] FLAG: --manifest-url="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502667 4764 flags.go:64] FLAG: --manifest-url-header="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502673 4764 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502677 4764 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502682 4764 flags.go:64] FLAG: --max-pods="110" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502686 4764 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502690 4764 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502694 4764 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502698 4764 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502702 4764 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502706 4764 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502710 4764 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502719 4764 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502724 4764 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502728 4764 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.502734 4764 flags.go:64] FLAG: --pod-cidr="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502739 4764 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502749 4764 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502754 4764 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502758 4764 flags.go:64] FLAG: --pods-per-core="0" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502763 4764 flags.go:64] FLAG: --port="10250" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502768 4764 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502773 4764 flags.go:64] FLAG: --provider-id="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502778 4764 flags.go:64] FLAG: --qos-reserved="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502784 4764 flags.go:64] FLAG: --read-only-port="10255" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502789 4764 flags.go:64] FLAG: --register-node="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502795 4764 flags.go:64] FLAG: --register-schedulable="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502800 4764 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502809 4764 flags.go:64] FLAG: --registry-burst="10" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502814 4764 flags.go:64] FLAG: --registry-qps="5" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502819 4764 flags.go:64] FLAG: --reserved-cpus="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502824 4764 flags.go:64] FLAG: --reserved-memory="" Oct 
01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502829 4764 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502833 4764 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502838 4764 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502842 4764 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502867 4764 flags.go:64] FLAG: --runonce="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502871 4764 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502875 4764 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502880 4764 flags.go:64] FLAG: --seccomp-default="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502884 4764 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502888 4764 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502892 4764 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502896 4764 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502901 4764 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502906 4764 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502914 4764 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502928 4764 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502937 4764 flags.go:64] FLAG: 
--streaming-connection-idle-timeout="4h0m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502942 4764 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502947 4764 flags.go:64] FLAG: --system-cgroups="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502952 4764 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502960 4764 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502964 4764 flags.go:64] FLAG: --tls-cert-file="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502969 4764 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502974 4764 flags.go:64] FLAG: --tls-min-version="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502978 4764 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502982 4764 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502986 4764 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502990 4764 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.502996 4764 flags.go:64] FLAG: --v="2" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.503002 4764 flags.go:64] FLAG: --version="false" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.503008 4764 flags.go:64] FLAG: --vmodule="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.503013 4764 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.503018 4764 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503153 4764 feature_gate.go:330] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503162 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503168 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503173 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503177 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503181 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503185 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503189 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503193 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503196 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503200 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503203 4764 feature_gate.go:330] unrecognized feature gate: Example Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503207 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503211 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503218 4764 feature_gate.go:353] Setting GA 
feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503223 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503227 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503231 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503234 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503239 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503243 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503247 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503250 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503253 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503257 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503261 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503264 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503267 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503271 4764 feature_gate.go:330] 
unrecognized feature gate: AzureWorkloadIdentity Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503274 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503278 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503283 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503287 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503290 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503294 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503297 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503301 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503304 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503308 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503311 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503316 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503320 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503324 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 16:02:17 crc 
kubenswrapper[4764]: W1001 16:02:17.503329 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503333 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503337 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503346 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503351 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503355 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503358 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503362 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503366 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503369 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503372 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503376 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503379 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503383 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503387 4764 feature_gate.go:330] unrecognized 
feature gate: BareMetalLoadBalancer Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503390 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503394 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503397 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503401 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503404 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503408 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503412 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503415 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503420 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503425 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503429 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503433 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.503436 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.503449 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.513125 4764 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.513178 4764 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513261 4764 feature_gate.go:330] unrecognized feature gate: Example Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513271 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513276 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513280 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513286 4764 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513290 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513295 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513300 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513304 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513308 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513313 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513317 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513321 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513326 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513331 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513337 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513344 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513350 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513356 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513361 4764 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513367 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513372 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513378 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513383 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513388 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513392 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513396 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513401 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513405 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513410 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513414 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513419 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513423 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513428 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 
01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513434 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513438 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513442 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513447 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513452 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513457 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513463 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513468 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513473 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513477 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513491 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513496 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513500 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513505 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 
16:02:17.513511 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513519 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513524 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513529 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513533 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513538 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513543 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513549 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513555 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513559 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513564 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513569 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513573 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513578 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513583 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513587 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513591 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513595 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513601 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513606 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513611 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513615 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513621 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.513629 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513781 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513789 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513795 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513799 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513803 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513808 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 16:02:17 crc 
kubenswrapper[4764]: W1001 16:02:17.513812 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513817 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513821 4764 feature_gate.go:330] unrecognized feature gate: Example Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513827 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513831 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513836 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513840 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513844 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513849 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513854 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513858 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513862 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513866 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513870 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513874 4764 feature_gate.go:330] unrecognized 
feature gate: SignatureStores Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513878 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513883 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513887 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513891 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513895 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513899 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513904 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513908 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513912 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513916 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513921 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513925 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513929 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513934 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513939 
4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513944 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513949 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513954 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513960 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513965 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513970 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513976 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513983 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513988 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.513994 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514003 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514008 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514013 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514017 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514022 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514026 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514030 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514035 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514040 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514059 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514064 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514069 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514073 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514077 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 
16:02:17.514082 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514087 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514091 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514096 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514100 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514104 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514110 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514115 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514120 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514125 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.514130 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.514141 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.514338 4764 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.518638 4764 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.519453 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.521199 4764 server.go:997] "Starting client certificate rotation" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.521232 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.524253 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-18 04:03:06.93171258 +0000 UTC Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.524360 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2604h0m49.407354879s for next certificate rotation Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.549864 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.554398 4764 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.573083 4764 log.go:25] "Validated CRI v1 runtime API" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.602326 4764 log.go:25] "Validated CRI v1 image API" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.605092 4764 server.go:1437] "Using cgroup 
driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.610503 4764 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-15-57-54-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.610558 4764 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.638682 4764 manager.go:217] Machine: {Timestamp:2025-10-01 16:02:17.635224212 +0000 UTC m=+0.634871097 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5f30d9a2-b6a5-482f-9083-66d464270d1d BootID:2a812319-9b55-40ee-9d8a-92eb5dff7a6a Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a3:30:66 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a3:30:66 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:65:d1:1e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:57:25:d1 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:be:ec:14 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f6:4b:da Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:19:e1:82 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:16:0e:5c:37:ec Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:db:d7:ee:c0:79 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] 
Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.639113 4764 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.639340 4764 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.642142 4764 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.642500 4764 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.642614 4764 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.642879 4764 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.642894 4764 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.643589 4764 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.643630 4764 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.643947 4764 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.644072 4764 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.647382 4764 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.647404 4764 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.647420 4764 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.647431 4764 kubelet.go:324] "Adding apiserver pod source"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.647443 4764 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.650784 4764 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.652202 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.653717 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.653721 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused
Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.653792 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError"
Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.653809 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.657775 4764 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660087 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660125 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660134 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660141 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660151 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660158 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660165 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660176 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660190 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660197 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660208 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.660214 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.663423 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.664620 4764 server.go:1280] "Started kubelet"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.665450 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.666016 4764 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.666017 4764 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.666635 4764 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 01 16:02:17 crc systemd[1]: Started Kubernetes Kubelet.
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.668553 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.668589 4764 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.668832 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 22:28:39.462413449 +0000 UTC
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.668875 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1878h26m21.793541269s for next certificate rotation
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.669651 4764 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.669692 4764 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.669761 4764 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.670818 4764 server.go:460] "Adding debug handlers to kubelet server"
Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.671870 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.671930 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.203:6443: connect: connection refused" interval="200ms"
Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.672225 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused
Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.672288 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError"
Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.671370 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.203:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a6970ce9305dc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 16:02:17.66459542 +0000 UTC m=+0.664242255,LastTimestamp:2025-10-01 16:02:17.66459542 +0000 UTC m=+0.664242255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.675701 4764 factory.go:55] Registering systemd factory
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.676469 4764 factory.go:221] Registration of the systemd container factory successfully
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.676757 4764 factory.go:153] Registering CRI-O factory
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.676789 4764 factory.go:221] Registration of the crio container factory successfully
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.676911 4764 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.676953 4764 factory.go:103] Registering Raw factory
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.676974 4764 manager.go:1196] Started watching for new ooms in manager
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677417 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677453 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677467 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677478 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677489 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677502 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677513 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677549 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677563 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677574 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677585 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677596 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677609 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677613 4764 manager.go:319] Starting recovery of all containers
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677621 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677633 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677645 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677656 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677666 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677677 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677688 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677698 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677709 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677720 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677731 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677743 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677754 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.677767 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678019 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678033 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678058 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678071 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678083 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678096 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678106 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678117 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678129 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678140 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678152 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678162 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678173 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678185 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678197 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678226 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678237 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678248 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678259 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678271 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678285 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678297 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678311 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678324 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678341 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678368 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678381 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678393 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678403 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678415 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678425 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678437 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678454 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678493 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678505 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678515 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678528 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678540 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678551 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678562 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678575 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678587 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678598 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678609 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678620 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678631 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678644 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678660 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678672 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678683 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678694 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678705 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678718 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678729 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678742 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678753 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678766 4764 reconstruct.go:130] "Volume is marked as uncertain and
added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678779 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678792 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678805 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678818 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678829 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678841 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678854 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678869 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678882 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678894 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678907 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678920 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678933 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678945 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678958 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678971 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678983 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.678994 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679006 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679019 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679037 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679066 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679099 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679114 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679127 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679141 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679153 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679169 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679182 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679195 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" 
seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679213 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679228 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679241 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679253 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679264 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679276 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.679287 4764 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682338 4764 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682371 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682388 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682401 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682415 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.682429 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682444 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682494 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682507 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682519 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682533 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682546 4764 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682560 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682572 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682584 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682598 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682611 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682623 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682636 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682665 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682679 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682692 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682705 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682717 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.682730 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686394 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686416 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686440 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686500 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686522 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686536 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686551 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686573 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686587 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686603 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686622 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.686638 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686659 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686674 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686690 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686713 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686728 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686747 4764 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686762 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686777 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686798 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686812 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686831 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686848 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686864 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.686883 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688724 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688764 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688775 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688789 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688800 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688814 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688824 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688834 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688848 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688858 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688871 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688880 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688889 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688902 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688912 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688922 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688935 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688945 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688959 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688969 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688978 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.688991 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689000 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689013 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689023 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689033 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689067 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689082 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 
16:02:17.689100 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689112 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689176 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689186 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689230 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689712 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689746 4764 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689761 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689773 4764 reconstruct.go:97] "Volume reconstruction finished" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.689783 4764 reconciler.go:26] "Reconciler: start to sync state" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.700826 4764 manager.go:324] Recovery completed Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.708994 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.710597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.710759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.710884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.715806 4764 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.715833 4764 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.715855 4764 state_mem.go:36] "Initialized new in-memory state store" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.717774 4764 
kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.720262 4764 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.720449 4764 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.720531 4764 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.720801 4764 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 16:02:17 crc kubenswrapper[4764]: W1001 16:02:17.724941 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.725136 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.738366 4764 policy_none.go:49] "None policy: Start" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.739594 4764 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.739634 4764 state_mem.go:35] "Initializing new in-memory state store" Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.772450 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.808668 4764 manager.go:334] "Starting Device Plugin manager" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.808821 4764 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.808914 4764 server.go:79] "Starting device plugin registration server" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.809433 4764 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.809505 4764 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.810085 4764 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.810326 4764 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.810342 4764 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.821192 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.822259 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.822351 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.823498 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.823531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.823543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.823679 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.823877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.823912 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.824653 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824753 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.824831 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.825404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.825437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.825446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.825581 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.825822 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.825901 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826756 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826914 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.826947 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.827479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.827503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.827512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.827985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.827992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828195 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828212 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.828857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.872465 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.203:6443: connect: connection refused" interval="400ms" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891273 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891351 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891382 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891600 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891646 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891751 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.891857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.910109 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.911656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.911705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.911716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.911748 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:17 crc kubenswrapper[4764]: E1001 16:02:17.912387 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.203:6443: connect: connection refused" node="crc" Oct 01 16:02:17 crc 
kubenswrapper[4764]: I1001 16:02:17.992718 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992890 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992970 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.992986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993085 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993212 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:17 crc kubenswrapper[4764]: I1001 16:02:17.993315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.113315 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.115451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.115494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.115507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.115532 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:18 crc kubenswrapper[4764]: E1001 16:02:18.116243 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.203:6443: connect: 
connection refused" node="crc" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.146236 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.154996 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.177406 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.194598 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6600e12cc2932a3f16cdcd42a8a63b1d6afacc8486232eb145dbbe52cf3da983 WatchSource:0}: Error finding container 6600e12cc2932a3f16cdcd42a8a63b1d6afacc8486232eb145dbbe52cf3da983: Status 404 returned error can't find the container with id 6600e12cc2932a3f16cdcd42a8a63b1d6afacc8486232eb145dbbe52cf3da983 Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.195076 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.195355 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-993b04381cd4c9f43173c19d6659308cd483a8b3cd58c36526fe19c8ba0db3cd WatchSource:0}: Error finding container 993b04381cd4c9f43173c19d6659308cd483a8b3cd58c36526fe19c8ba0db3cd: Status 404 returned error can't find the container with id 993b04381cd4c9f43173c19d6659308cd483a8b3cd58c36526fe19c8ba0db3cd Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.199745 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-241cd389f08583566d789f9c2700fe5e80760cc7cba6b0e7b5830d9b5de3a354 WatchSource:0}: Error finding container 241cd389f08583566d789f9c2700fe5e80760cc7cba6b0e7b5830d9b5de3a354: Status 404 returned error can't find the container with id 241cd389f08583566d789f9c2700fe5e80760cc7cba6b0e7b5830d9b5de3a354 Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.200377 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.206213 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-438b644d89d8c84915a2be48c10858ee9e3e4cc4f142d8b697e29c769d2c7da3 WatchSource:0}: Error finding container 438b644d89d8c84915a2be48c10858ee9e3e4cc4f142d8b697e29c769d2c7da3: Status 404 returned error can't find the container with id 438b644d89d8c84915a2be48c10858ee9e3e4cc4f142d8b697e29c769d2c7da3 Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.209631 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-706d0a2b6d0c44ca3728235e9cc7872655e6f219d4b3c6abc4a6bf2ee0583dc8 WatchSource:0}: Error finding container 706d0a2b6d0c44ca3728235e9cc7872655e6f219d4b3c6abc4a6bf2ee0583dc8: Status 404 returned error can't find the container with id 706d0a2b6d0c44ca3728235e9cc7872655e6f219d4b3c6abc4a6bf2ee0583dc8 Oct 01 16:02:18 crc kubenswrapper[4764]: E1001 16:02:18.273145 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.203:6443: connect: connection refused" interval="800ms" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.517096 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.518208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.518238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.518247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.518269 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:18 crc kubenswrapper[4764]: E1001 16:02:18.518537 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.203:6443: connect: connection refused" node="crc" Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.666652 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.725195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6600e12cc2932a3f16cdcd42a8a63b1d6afacc8486232eb145dbbe52cf3da983"} Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.727906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"706d0a2b6d0c44ca3728235e9cc7872655e6f219d4b3c6abc4a6bf2ee0583dc8"} Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.731216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"438b644d89d8c84915a2be48c10858ee9e3e4cc4f142d8b697e29c769d2c7da3"} Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.733003 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"241cd389f08583566d789f9c2700fe5e80760cc7cba6b0e7b5830d9b5de3a354"} Oct 01 16:02:18 crc kubenswrapper[4764]: I1001 16:02:18.733619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"993b04381cd4c9f43173c19d6659308cd483a8b3cd58c36526fe19c8ba0db3cd"} Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.909139 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:18 crc kubenswrapper[4764]: E1001 16:02:18.909223 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:18 crc kubenswrapper[4764]: W1001 16:02:18.968032 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:18 crc kubenswrapper[4764]: E1001 16:02:18.968137 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:18 crc 
kubenswrapper[4764]: W1001 16:02:18.972108 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:18 crc kubenswrapper[4764]: E1001 16:02:18.972177 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:19 crc kubenswrapper[4764]: E1001 16:02:19.074675 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.203:6443: connect: connection refused" interval="1.6s" Oct 01 16:02:19 crc kubenswrapper[4764]: W1001 16:02:19.212931 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:19 crc kubenswrapper[4764]: E1001 16:02:19.213089 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.318874 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.320451 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.320511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.320525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.320559 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:19 crc kubenswrapper[4764]: E1001 16:02:19.321161 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.203:6443: connect: connection refused" node="crc" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.666979 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.737499 4764 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d" exitCode=0 Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.737575 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.737619 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.738730 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.738830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.738845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.739752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.739833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.739850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.739869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.739980 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.740918 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.740962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.740973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.742001 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81" exitCode=0 Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.742084 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.742088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.742736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.742759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.742768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.743857 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.744182 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e" exitCode=0 Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.744253 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.744270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.744658 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.744692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.744702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.745729 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.745755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.745763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.747730 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d" exitCode=0 Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.747767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d"} Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.747815 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.748561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.748585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:19 crc kubenswrapper[4764]: I1001 16:02:19.748593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:20 crc kubenswrapper[4764]: W1001 16:02:20.621014 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:20 crc kubenswrapper[4764]: E1001 16:02:20.621182 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.666544 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:20 crc kubenswrapper[4764]: E1001 
16:02:20.675888 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.203:6443: connect: connection refused" interval="3.2s" Oct 01 16:02:20 crc kubenswrapper[4764]: W1001 16:02:20.742308 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:20 crc kubenswrapper[4764]: E1001 16:02:20.742462 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.756151 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be" exitCode=0 Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.756283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.756366 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.757440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.757488 4764 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.757503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.759978 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.759967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.761547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.761587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.761601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.765748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.765792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.765807 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.766753 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.768009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.768071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.768089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.771273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.771333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.771348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.771357 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.771361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e"} Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.772571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.772615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.772629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.922121 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.923322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.923395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.923409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.923448 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:20 crc kubenswrapper[4764]: E1001 16:02:20.924181 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.203:6443: connect: connection refused" node="crc" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 
16:02:20.942770 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:20 crc kubenswrapper[4764]: I1001 16:02:20.952166 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:21 crc kubenswrapper[4764]: W1001 16:02:21.063695 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.203:6443: connect: connection refused Oct 01 16:02:21 crc kubenswrapper[4764]: E1001 16:02:21.063767 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.203:6443: connect: connection refused" logger="UnhandledError" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.776245 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5" exitCode=0 Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.776338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5"} Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.776459 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.777608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:21 crc 
kubenswrapper[4764]: I1001 16:02:21.777661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.777678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.782298 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.783087 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.783747 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32"} Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.783801 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.783895 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.784625 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.785243 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786345 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.786708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:21 crc kubenswrapper[4764]: I1001 16:02:21.896836 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:22 crc 
kubenswrapper[4764]: I1001 16:02:22.787549 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.787945 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70"} Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.787969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0"} Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.787978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f"} Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.787989 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7"} Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.788078 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.788358 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.788416 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.788942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.788965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.788974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.789455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.789473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.789480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.790017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.790034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:22 crc kubenswrapper[4764]: I1001 16:02:22.790069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.466163 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.799765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623"} Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.799850 4764 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.799883 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.799951 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:23 crc kubenswrapper[4764]: I1001 16:02:23.801593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.124926 
4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.126672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.126736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.126809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.126846 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.127192 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.802114 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.803154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.803202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.803214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.803225 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.804279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.804339 
4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:24 crc kubenswrapper[4764]: I1001 16:02:24.804362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.057122 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.057282 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.058328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.058358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.058366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.804847 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.805781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.805816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:25 crc kubenswrapper[4764]: I1001 16:02:25.805829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:26 crc kubenswrapper[4764]: I1001 16:02:26.740228 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:26 crc kubenswrapper[4764]: I1001 16:02:26.740487 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:26 crc kubenswrapper[4764]: I1001 16:02:26.741976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:26 crc kubenswrapper[4764]: I1001 16:02:26.742035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:26 crc kubenswrapper[4764]: I1001 16:02:26.742075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:27 crc kubenswrapper[4764]: E1001 16:02:27.821926 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.057557 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.057692 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.403244 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.403643 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.404964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.405085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:28 crc kubenswrapper[4764]: I1001 16:02:28.405164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.667817 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.729888 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.730343 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.731641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.731692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.731705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:31 crc kubenswrapper[4764]: W1001 16:02:31.986496 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 16:02:31 crc kubenswrapper[4764]: I1001 16:02:31.986595 4764 trace.go:236] Trace[2007299585]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 16:02:21.984) (total time: 10001ms): Oct 01 16:02:31 crc kubenswrapper[4764]: Trace[2007299585]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (16:02:31.986) Oct 01 16:02:31 crc kubenswrapper[4764]: Trace[2007299585]: [10.001589434s] [10.001589434s] END Oct 01 16:02:31 crc kubenswrapper[4764]: E1001 16:02:31.986623 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 16:02:32 crc kubenswrapper[4764]: I1001 16:02:32.012922 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 16:02:32 crc kubenswrapper[4764]: I1001 16:02:32.012981 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 16:02:32 crc kubenswrapper[4764]: I1001 16:02:32.016657 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 16:02:32 crc kubenswrapper[4764]: I1001 16:02:32.016722 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.746171 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.746322 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.747690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.747757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.747774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.750972 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.832744 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.833597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.833767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:36 crc kubenswrapper[4764]: I1001 16:02:36.833888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.004381 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.007458 4764 trace.go:236] Trace[1825486427]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 16:02:24.611) (total time: 12396ms): Oct 01 16:02:37 crc kubenswrapper[4764]: Trace[1825486427]: ---"Objects listed" error: 12396ms (16:02:37.007) Oct 01 16:02:37 crc kubenswrapper[4764]: Trace[1825486427]: [12.396116915s] [12.396116915s] END Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.007499 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.007505 4764 trace.go:236] Trace[2075668036]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 16:02:24.330) (total time: 12676ms): Oct 01 16:02:37 crc kubenswrapper[4764]: Trace[2075668036]: ---"Objects listed" error: 12676ms (16:02:37.007) Oct 01 16:02:37 crc kubenswrapper[4764]: Trace[2075668036]: [12.676748065s] [12.676748065s] END Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.007529 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.007679 4764 trace.go:236] Trace[1336500497]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 16:02:24.376) (total time: 12630ms): Oct 01 16:02:37 crc kubenswrapper[4764]: Trace[1336500497]: ---"Objects listed" error: 12630ms (16:02:37.007) Oct 01 16:02:37 crc kubenswrapper[4764]: Trace[1336500497]: [12.6309925s] [12.6309925s] END Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.007693 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.009307 4764 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.009489 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.055454 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33848->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.055482 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33852->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.055530 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 
192.168.126.11:33852->192.168.126.11:17697: read: connection reset by peer" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.055521 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33848->192.168.126.11:17697: read: connection reset by peer" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.055970 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.055993 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.082653 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.095728 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.498570 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.661565 4764 apiserver.go:52] "Watching apiserver" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.665073 
4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.665560 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.665992 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.666105 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.666252 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.666505 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.666511 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.666734 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.666916 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.666930 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.667227 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.668318 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.668430 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.668447 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.668735 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.668751 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.669224 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.669623 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.670338 4764 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.670847 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.671869 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 16:02:37 crc kubenswrapper[4764]: 
I1001 16:02:37.693941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.707521 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712808 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712854 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712916 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712941 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712962 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.712983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713004 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713028 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713064 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713104 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713150 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" 
(UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713181 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713170 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713269 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: 
"1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713363 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713395 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713419 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713441 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") 
" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713486 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713507 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713529 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713550 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713572 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713659 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713682 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713704 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713726 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713773 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713796 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713810 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713872 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713978 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714003 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714026 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714094 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 
16:02:37.714117 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714138 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714159 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" 
(UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714293 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714315 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714342 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714364 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714388 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714434 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714455 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 16:02:37 crc 
kubenswrapper[4764]: I1001 16:02:37.714501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714546 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714646 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714729 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 16:02:37 crc 
kubenswrapper[4764]: I1001 16:02:37.714774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714798 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714821 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714886 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714959 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " 
Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715071 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715169 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715218 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715241 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715319 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715438 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715462 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715489 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715515 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715563 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715666 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715715 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715791 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 
16:02:37.715887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715939 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716017 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716092 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716117 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716143 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716192 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716218 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716317 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716342 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716365 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716476 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716504 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716530 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716565 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716622 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716646 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716671 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716720 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716745 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716844 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716946 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717000 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717453 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717502 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717526 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717642 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717662 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717711 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717735 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717828 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717877 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717903 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717932 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717958 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 16:02:37 crc 
kubenswrapper[4764]: I1001 16:02:37.718007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718031 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718087 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718119 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.713926 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714206 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714401 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714417 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720872 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714665 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714682 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.714958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715259 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715552 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715622 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.715880 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.716861 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717166 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717156 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717470 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717506 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.717575 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718008 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718108 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718164 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718188 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718401 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718725 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718993 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719108 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719463 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719514 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719567 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.719956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720366 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720495 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.720631 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721023 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721297 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721380 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721345 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721755 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.721853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722252 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722538 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722573 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722868 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722723 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722913 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.723150 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.722229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.723478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.723526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.723621 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.723681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.723780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724076 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724032 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724136 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724397 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724464 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724464 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724683 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724713 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.724950 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725142 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725185 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725371 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725496 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.725519 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.726186 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.726563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-ope
rator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.726918 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:02:38.226890746 +0000 UTC m=+21.226537631 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.726910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.726941 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.726995 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.727292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.727319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.727498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.727836 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.727675 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728289 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728530 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.728880 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729441 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729584 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729608 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729683 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.729727 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730013 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730166 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.718201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730261 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730401 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730562 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc 
kubenswrapper[4764]: I1001 16:02:37.730678 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730737 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730909 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730950 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.730980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731251 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731450 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.731488 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732723 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732749 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.731706 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732796 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732822 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732843 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732791 4764 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732934 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733195 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.733342 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:38.232842482 +0000 UTC m=+21.232489387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732367 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.732440 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733667 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733687 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733701 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733714 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733727 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733740 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733751 4764 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733761 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733770 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733782 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733873 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733885 4764 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733898 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733910 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733923 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733970 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733984 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.733996 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on 
node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734010 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734189 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734206 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734220 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734231 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734241 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734597 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734610 4764 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734619 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734628 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734636 4764 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734709 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734721 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734730 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734738 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734747 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734755 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734766 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734862 4764 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734871 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734879 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734889 4764 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734899 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.734908 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735010 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.735123 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735139 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.735174 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:38.235165105 +0000 UTC m=+21.234811940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735189 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735333 4764 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735344 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735354 4764 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735362 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735422 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 
crc kubenswrapper[4764]: I1001 16:02:37.735433 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735442 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.735324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736658 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736687 4764 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736703 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736716 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736733 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736746 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.736760 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.737322 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738246 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738270 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738282 4764 reconciler_common.go:293] "Volume 
detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738297 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738311 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738323 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738336 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738348 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738360 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738373 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738384 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738395 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738406 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738417 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738429 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738442 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738455 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" 
Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738466 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738477 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738488 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738501 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738512 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738523 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738534 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738547 4764 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738580 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738595 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738606 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738618 4764 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738652 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738664 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738678 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738691 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738702 4764 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738714 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738728 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738740 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738751 4764 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738762 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738779 4764 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738791 4764 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738802 4764 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738813 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738824 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738834 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738846 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738857 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738868 4764 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738879 4764 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738889 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738901 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738912 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738923 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node 
\"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738933 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738943 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738952 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738962 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738973 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738984 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.738995 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739007 4764 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739017 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739028 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739039 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739067 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739078 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739090 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739101 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739112 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.739122 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.745090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.746329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.746927 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.748714 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.749856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.749865 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.749994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.750184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.752825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.754134 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.754176 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.754193 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.754266 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:38.254244871 +0000 UTC m=+21.253891706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.754303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.754400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.754462 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.754891 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.755002 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.755163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.756075 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.757284 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.757364 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.757438 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.757674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.757762 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.759412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.760008 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.762352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.763339 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.765326 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.766213 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.767024 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.767077 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.767094 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.767157 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:38.267137645 +0000 UTC m=+21.266784570 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.768454 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.769095 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.769200 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.770011 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.770345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.770836 
4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.771504 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.772477 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.772566 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.773081 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.773643 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.774528 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.774988 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.775529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.776306 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.776623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777022 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777224 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777767 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777795 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.777957 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.778159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.779407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.779749 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.779967 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.779986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.780213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.780296 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.780676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.780847 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.780962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.780911 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781067 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781137 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781406 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.781710 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.782931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.783768 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.784510 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.785128 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.786865 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.787519 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.788986 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.789704 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.790957 4764 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.791158 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.791531 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.793486 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.793981 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.795000 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.796606 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.797288 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.797975 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.798394 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.799185 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.800507 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.801089 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.801221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.802762 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.803524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.803569 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.804327 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.805621 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.806277 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.807991 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.808556 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.809742 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.810336 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.811106 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.811378 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.812582 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.813244 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.813814 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.822704 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.834424 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.836915 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.838682 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32" exitCode=255 Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.838773 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32"} Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.839400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.839552 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.839913 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: E1001 16:02:37.843297 4764 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.843461 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.843540 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844257 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844512 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844538 4764 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844552 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844565 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844577 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844587 4764 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844599 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844609 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844619 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844627 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node 
\"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844636 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844644 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844652 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844661 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.844669 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846551 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846574 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846585 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846597 4764 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846608 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846620 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846631 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846642 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846652 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846735 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc 
kubenswrapper[4764]: I1001 16:02:37.846744 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846752 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846761 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846769 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846777 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846785 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846793 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846801 4764 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846809 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846818 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846826 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846833 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846842 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846852 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846860 4764 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846869 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846880 4764 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846891 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846902 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846911 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846919 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846927 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846935 4764 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846942 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846950 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846962 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846973 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846985 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.846995 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc 
kubenswrapper[4764]: I1001 16:02:37.847117 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.847150 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.849130 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.847163 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.849186 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.849909 4764 scope.go:117] "RemoveContainer" containerID="078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.853189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.862379 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.871993 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.881671 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.892308 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.900174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.912318 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.922199 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.932014 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.942496 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.951393 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.985123 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 16:02:37 crc kubenswrapper[4764]: I1001 16:02:37.993729 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 16:02:38 crc kubenswrapper[4764]: W1001 16:02:37.999936 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-e59aa83927c2eb82bd58aefb14890248a1923d2a44d8379ef42bbaad8d95baaa WatchSource:0}: Error finding container e59aa83927c2eb82bd58aefb14890248a1923d2a44d8379ef42bbaad8d95baaa: Status 404 returned error can't find the container with id e59aa83927c2eb82bd58aefb14890248a1923d2a44d8379ef42bbaad8d95baaa Oct 01 16:02:38 crc kubenswrapper[4764]: W1001 16:02:38.005580 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-374031d2c02a3a9fe8ec72a3fa302a77e78d27c4614696c7bf5540968e5b828f WatchSource:0}: Error finding container 374031d2c02a3a9fe8ec72a3fa302a77e78d27c4614696c7bf5540968e5b828f: Status 404 returned error can't find the container with id 374031d2c02a3a9fe8ec72a3fa302a77e78d27c4614696c7bf5540968e5b828f Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.007984 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 16:02:38 crc kubenswrapper[4764]: W1001 16:02:38.022497 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a37e5b038046d6e5ba490671cc6e9f842fdd51bc95dc0fbe50e85d0c6d58d497 WatchSource:0}: Error finding container a37e5b038046d6e5ba490671cc6e9f842fdd51bc95dc0fbe50e85d0c6d58d497: Status 404 returned error can't find the container with id a37e5b038046d6e5ba490671cc6e9f842fdd51bc95dc0fbe50e85d0c6d58d497 Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.251473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.251600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.251640 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.251672 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:02:39.251637753 +0000 UTC m=+22.251284608 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.251732 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.251817 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:39.251793037 +0000 UTC m=+22.251440042 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.251906 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.252090 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:39.252034703 +0000 UTC m=+22.251681538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.352421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.352494 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352561 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352585 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352597 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352650 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:39.35263394 +0000 UTC m=+22.352280775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352664 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352702 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352715 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:38 crc kubenswrapper[4764]: E1001 16:02:38.352777 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:39.352759824 +0000 UTC m=+22.352406709 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.842488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.842539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e59aa83927c2eb82bd58aefb14890248a1923d2a44d8379ef42bbaad8d95baaa"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.844982 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.850164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.850913 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.852459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a37e5b038046d6e5ba490671cc6e9f842fdd51bc95dc0fbe50e85d0c6d58d497"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.855177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.855209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.855224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"374031d2c02a3a9fe8ec72a3fa302a77e78d27c4614696c7bf5540968e5b828f"} Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.863522 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.879266 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.905707 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.920987 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.946780 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.963761 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.976305 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:38 crc kubenswrapper[4764]: I1001 16:02:38.989592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.001693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.013012 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.024077 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.035226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.045604 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.059003 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.071156 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.084030 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.258225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.258345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.258360 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:02:41.258334941 +0000 UTC m=+24.257981796 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.258400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.258502 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.258578 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:41.258558526 +0000 UTC m=+24.258205381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.258733 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.258926 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:41.258895764 +0000 UTC m=+24.258542799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.359314 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.359405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359527 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359566 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359578 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359589 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359619 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359638 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 
16:02:39.359639 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:41.359618265 +0000 UTC m=+24.359265100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.359744 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:41.359714678 +0000 UTC m=+24.359361603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.721366 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.721406 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.721448 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.721519 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.721627 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:39 crc kubenswrapper[4764]: E1001 16:02:39.721769 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.724892 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.725396 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.726673 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.727315 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.727875 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.729103 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.729672 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.730904 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.731502 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.732491 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 16:02:39 crc kubenswrapper[4764]: I1001 16:02:39.733164 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.861112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b"} Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.878093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.893655 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.916693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.930347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.943230 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.961588 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.974548 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:40 crc kubenswrapper[4764]: I1001 16:02:40.989104 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.276580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.276691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.276799 4764 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:02:45.276770552 +0000 UTC m=+28.276417397 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.276849 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.276888 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.276923 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:45.276902015 +0000 UTC m=+28.276548890 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.276994 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.277114 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:45.277097419 +0000 UTC m=+28.276744354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.377936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.378106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378171 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378206 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378223 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378244 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378267 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378281 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 
16:02:41.378285 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:45.37826775 +0000 UTC m=+28.377914595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.378339 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:45.378325151 +0000 UTC m=+28.377972006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.721667 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.721696 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.721687 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.721848 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.721970 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:41 crc kubenswrapper[4764]: E1001 16:02:41.722063 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.755173 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.765501 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.768199 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.769619 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.783654 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.796562 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.808449 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.824221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.843346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.872952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.892224 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.911988 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.932684 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.948876 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.970922 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:41 crc kubenswrapper[4764]: I1001 16:02:41.994649 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.014911 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.027410 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.037593 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.056458 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.444377 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2bzj9"] Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.444639 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.446404 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.446735 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.448546 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.450258 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zf6qx"] Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.450671 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: W1001 16:02:42.452219 4764 reflector.go:561] object-"openshift-machine-config-operator"/"proxy-tls": failed to list *v1.Secret: secrets "proxy-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-config-operator": no relationship found between node 'crc' and this object Oct 01 16:02:42 crc kubenswrapper[4764]: E1001 16:02:42.452261 4764 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-config-operator\"/\"proxy-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"proxy-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-config-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.452518 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.452800 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.453168 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.454924 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.467026 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.491291 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.505506 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.516789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.528302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.540732 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.552636 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.569619 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.588369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2068a381-c49b-41a4-bd0d-8c525f9b30d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.588406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2068a381-c49b-41a4-bd0d-8c525f9b30d0-proxy-tls\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.588427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/77d0b256-53a7-44ab-aee2-904dd15bfa80-hosts-file\") pod \"node-resolver-2bzj9\" (UID: \"77d0b256-53a7-44ab-aee2-904dd15bfa80\") " pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.588529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2068a381-c49b-41a4-bd0d-8c525f9b30d0-rootfs\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.588638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x65tx\" (UniqueName: \"kubernetes.io/projected/2068a381-c49b-41a4-bd0d-8c525f9b30d0-kube-api-access-x65tx\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.588716 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slx78\" (UniqueName: \"kubernetes.io/projected/77d0b256-53a7-44ab-aee2-904dd15bfa80-kube-api-access-slx78\") pod \"node-resolver-2bzj9\" (UID: \"77d0b256-53a7-44ab-aee2-904dd15bfa80\") " pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.591916 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de13
09f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.606309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.621290 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.638250 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.654929 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.672175 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2068a381-c49b-41a4-bd0d-8c525f9b30d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689649 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/2068a381-c49b-41a4-bd0d-8c525f9b30d0-proxy-tls\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/77d0b256-53a7-44ab-aee2-904dd15bfa80-hosts-file\") pod \"node-resolver-2bzj9\" (UID: \"77d0b256-53a7-44ab-aee2-904dd15bfa80\") " pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2068a381-c49b-41a4-bd0d-8c525f9b30d0-rootfs\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x65tx\" (UniqueName: \"kubernetes.io/projected/2068a381-c49b-41a4-bd0d-8c525f9b30d0-kube-api-access-x65tx\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slx78\" (UniqueName: \"kubernetes.io/projected/77d0b256-53a7-44ab-aee2-904dd15bfa80-kube-api-access-slx78\") pod \"node-resolver-2bzj9\" (UID: \"77d0b256-53a7-44ab-aee2-904dd15bfa80\") " pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689904 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/2068a381-c49b-41a4-bd0d-8c525f9b30d0-rootfs\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.689961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/77d0b256-53a7-44ab-aee2-904dd15bfa80-hosts-file\") pod \"node-resolver-2bzj9\" (UID: \"77d0b256-53a7-44ab-aee2-904dd15bfa80\") " pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.690333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2068a381-c49b-41a4-bd0d-8c525f9b30d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.699107 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.717598 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slx78\" (UniqueName: \"kubernetes.io/projected/77d0b256-53a7-44ab-aee2-904dd15bfa80-kube-api-access-slx78\") pod \"node-resolver-2bzj9\" (UID: \"77d0b256-53a7-44ab-aee2-904dd15bfa80\") " pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.718581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x65tx\" (UniqueName: \"kubernetes.io/projected/2068a381-c49b-41a4-bd0d-8c525f9b30d0-kube-api-access-x65tx\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.743328 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.760130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2bzj9" Oct 01 16:02:42 crc kubenswrapper[4764]: W1001 16:02:42.774258 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d0b256_53a7_44ab_aee2_904dd15bfa80.slice/crio-3a94e9f6fa785ca5b4f25f8b4df608b64ee5c9238a6b3a8b86ba04827569ccd1 WatchSource:0}: Error finding container 3a94e9f6fa785ca5b4f25f8b4df608b64ee5c9238a6b3a8b86ba04827569ccd1: Status 404 returned error can't find the container with id 3a94e9f6fa785ca5b4f25f8b4df608b64ee5c9238a6b3a8b86ba04827569ccd1 Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.791381 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.809027 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.828468 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.834947 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jssc8"] Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.835881 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-ks425"] Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.836095 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.836337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.840263 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.840549 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.841457 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.841714 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.841965 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.847032 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.847989 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.857802 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a
51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.869410 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2bzj9" event={"ID":"77d0b256-53a7-44ab-aee2-904dd15bfa80","Type":"ContainerStarted","Data":"3a94e9f6fa785ca5b4f25f8b4df608b64ee5c9238a6b3a8b86ba04827569ccd1"} Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.872809 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.886147 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.900225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.922358 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.940789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.954475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.974125 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991230 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-cni-bin\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-os-release\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 
16:02:42.991310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ff6bf25-73d6-4e89-b803-12502064e5f4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-system-cni-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991345 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-etc-kubernetes\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991368 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-system-cni-dir\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-cni-multus\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 
16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991397 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-multus-certs\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-cnibin\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-k8s-cni-cncf-io\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ff6bf25-73d6-4e89-b803-12502064e5f4-cni-binary-copy\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-daemon-config\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " 
pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-hostroot\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-cnibin\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-socket-dir-parent\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991542 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-netns\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdwp\" (UniqueName: \"kubernetes.io/projected/8ff6bf25-73d6-4e89-b803-12502064e5f4-kube-api-access-9bdwp\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " 
pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-cni-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991587 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-conf-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991601 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5499b593-79e4-408e-a32b-9e132d3a0de7-cni-binary-copy\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-kubelet\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 crc kubenswrapper[4764]: I1001 16:02:42.991628 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-os-release\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:42 
crc kubenswrapper[4764]: I1001 16:02:42.991642 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ktn\" (UniqueName: \"kubernetes.io/projected/5499b593-79e4-408e-a32b-9e132d3a0de7-kube-api-access-n8ktn\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.001133 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:42Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.012751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.024908 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.035660 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.054843 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.067529 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.078600 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092389 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bdwp\" (UniqueName: \"kubernetes.io/projected/8ff6bf25-73d6-4e89-b803-12502064e5f4-kube-api-access-9bdwp\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-cni-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5499b593-79e4-408e-a32b-9e132d3a0de7-cni-binary-copy\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-kubelet\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-conf-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092594 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-os-release\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8ktn\" (UniqueName: \"kubernetes.io/projected/5499b593-79e4-408e-a32b-9e132d3a0de7-kube-api-access-n8ktn\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-cni-bin\") pod \"multus-ks425\" 
(UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-etc-kubernetes\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-os-release\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092782 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ff6bf25-73d6-4e89-b803-12502064e5f4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-system-cni-dir\") pod \"multus-ks425\" (UID: 
\"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-system-cni-dir\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092943 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-cni-multus\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092966 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-cnibin\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.092987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-k8s-cni-cncf-io\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-multus-certs\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " 
pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ff6bf25-73d6-4e89-b803-12502064e5f4-cni-binary-copy\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093090 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-daemon-config\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-cnibin\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093154 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-socket-dir-parent\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-netns\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093215 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-hostroot\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-hostroot\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093434 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-multus-certs\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-netns\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-cnibin\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-system-cni-dir\") pod \"multus-ks425\" (UID: 
\"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-cnibin\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-system-cni-dir\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-etc-kubernetes\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-run-k8s-cni-cncf-io\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-cni-bin\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc 
kubenswrapper[4764]: I1001 16:02:43.093655 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-cni-multus\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-socket-dir-parent\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093697 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-host-var-lib-kubelet\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-conf-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-cni-dir\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5499b593-79e4-408e-a32b-9e132d3a0de7-os-release\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.093891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ff6bf25-73d6-4e89-b803-12502064e5f4-os-release\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.094356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5499b593-79e4-408e-a32b-9e132d3a0de7-cni-binary-copy\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.094355 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5499b593-79e4-408e-a32b-9e132d3a0de7-multus-daemon-config\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.094430 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ff6bf25-73d6-4e89-b803-12502064e5f4-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.094557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ff6bf25-73d6-4e89-b803-12502064e5f4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.108897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8ktn\" (UniqueName: \"kubernetes.io/projected/5499b593-79e4-408e-a32b-9e132d3a0de7-kube-api-access-n8ktn\") pod \"multus-ks425\" (UID: \"5499b593-79e4-408e-a32b-9e132d3a0de7\") " pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.109993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bdwp\" (UniqueName: \"kubernetes.io/projected/8ff6bf25-73d6-4e89-b803-12502064e5f4-kube-api-access-9bdwp\") pod \"multus-additional-cni-plugins-jssc8\" (UID: \"8ff6bf25-73d6-4e89-b803-12502064e5f4\") " pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.147410 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jssc8" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.154264 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ks425" Oct 01 16:02:43 crc kubenswrapper[4764]: W1001 16:02:43.160138 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff6bf25_73d6_4e89_b803_12502064e5f4.slice/crio-1fcf782bb01435f62a5b20e039677634fac281977bd0b78ef4f22691ac65609f WatchSource:0}: Error finding container 1fcf782bb01435f62a5b20e039677634fac281977bd0b78ef4f22691ac65609f: Status 404 returned error can't find the container with id 1fcf782bb01435f62a5b20e039677634fac281977bd0b78ef4f22691ac65609f Oct 01 16:02:43 crc kubenswrapper[4764]: W1001 16:02:43.181838 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5499b593_79e4_408e_a32b_9e132d3a0de7.slice/crio-a7bc061307df6b7731aeaa3f5262d63c8c8d49277a4a2c6c5bfeae617a0e897b WatchSource:0}: Error finding container a7bc061307df6b7731aeaa3f5262d63c8c8d49277a4a2c6c5bfeae617a0e897b: Status 404 returned error can't find the container with id a7bc061307df6b7731aeaa3f5262d63c8c8d49277a4a2c6c5bfeae617a0e897b Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.221283 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fngxf"] Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.222295 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.224616 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.225011 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.225089 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.225125 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.225250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.225300 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.225543 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.249854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.263809 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.278527 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.291026 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.302914 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.321628 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.332256 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.346871 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.358457 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.370841 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.383299 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397031 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-systemd\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-systemd-units\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397313 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-node-log\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-slash\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397520 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zbvx\" (UniqueName: \"kubernetes.io/projected/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-kube-api-access-5zbvx\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-ovn\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-var-lib-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-kubelet\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397639 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-etc-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397671 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-config\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397692 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-bin\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397742 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-env-overrides\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-netd\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 
16:02:43.397783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-log-socket\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-script-lib\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovn-node-metrics-cert\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.397877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-netns\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.399989 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.409728 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.411664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.411708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.411718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.411778 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.412861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.421255 4764 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.421632 4764 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.422742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.422769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.422783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.422799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.422811 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.433525 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.446104 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.450578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.450700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.450765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.450824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.450894 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.464797 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.469324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.469378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.469395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.469416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.469431 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.481690 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.484851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.484889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.484901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.484917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.484930 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.496401 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498700 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-systemd-units\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498740 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-node-log\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498783 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-slash\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498803 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498827 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zbvx\" (UniqueName: \"kubernetes.io/projected/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-kube-api-access-5zbvx\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-ovn\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-var-lib-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498880 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-slash\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-kubelet\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-etc-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498936 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-node-log\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-config\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498993 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-bin\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-env-overrides\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-var-lib-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-netd\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499076 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-log-socket\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-script-lib\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovn-node-metrics-cert\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-netns\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-systemd\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-systemd\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.498841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-systemd-units\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499571 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-ovn\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-etc-openvswitch\") pod \"ovnkube-node-fngxf\" (UID: 
\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-kubelet\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-netns\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499758 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-script-lib\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499805 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-netd\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.499876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-log-socket\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc 
kubenswrapper[4764]: I1001 16:02:43.499985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500038 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500305 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-config\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-env-overrides\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.500391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-bin\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.505207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovn-node-metrics-cert\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.510844 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.511135 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.512463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.512497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.512512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.512533 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.512548 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.514331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zbvx\" (UniqueName: \"kubernetes.io/projected/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-kube-api-access-5zbvx\") pod \"ovnkube-node-fngxf\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.575427 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:43 crc kubenswrapper[4764]: W1001 16:02:43.587838 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe0fc1af_28a8_48cd_ba84_954c8e7de3e8.slice/crio-25941f43ecde66b19a0c769d012677267d4d81919e7d78ec24241ede26c2549b WatchSource:0}: Error finding container 25941f43ecde66b19a0c769d012677267d4d81919e7d78ec24241ede26c2549b: Status 404 returned error can't find the container with id 25941f43ecde66b19a0c769d012677267d4d81919e7d78ec24241ede26c2549b Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.618118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.618177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.618189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.618216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.618291 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.649883 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.652626 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2068a381-c49b-41a4-bd0d-8c525f9b30d0-proxy-tls\") pod \"machine-config-daemon-zf6qx\" (UID: \"2068a381-c49b-41a4-bd0d-8c525f9b30d0\") " pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.664171 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721187 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721159 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.721264 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.721306 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:43 crc kubenswrapper[4764]: E1001 16:02:43.721360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.721997 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: W1001 16:02:43.755506 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2068a381_c49b_41a4_bd0d_8c525f9b30d0.slice/crio-f56bbfae7b1c953fe2d3314bca9f046ef67ff133971634b97fa49522e6c88a41 WatchSource:0}: Error finding container f56bbfae7b1c953fe2d3314bca9f046ef67ff133971634b97fa49522e6c88a41: Status 404 returned error can't find the container with id f56bbfae7b1c953fe2d3314bca9f046ef67ff133971634b97fa49522e6c88a41 Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.824605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.824636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.824646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.824663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.824675 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.872999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"f56bbfae7b1c953fe2d3314bca9f046ef67ff133971634b97fa49522e6c88a41"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.874178 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" exitCode=0 Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.874260 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.874305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"25941f43ecde66b19a0c769d012677267d4d81919e7d78ec24241ede26c2549b"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.875588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerStarted","Data":"c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.875623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerStarted","Data":"a7bc061307df6b7731aeaa3f5262d63c8c8d49277a4a2c6c5bfeae617a0e897b"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.877999 4764 generic.go:334] "Generic (PLEG): container 
finished" podID="8ff6bf25-73d6-4e89-b803-12502064e5f4" containerID="c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae" exitCode=0 Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.878079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerDied","Data":"c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.878104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerStarted","Data":"1fcf782bb01435f62a5b20e039677634fac281977bd0b78ef4f22691ac65609f"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.880159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2bzj9" event={"ID":"77d0b256-53a7-44ab-aee2-904dd15bfa80","Type":"ContainerStarted","Data":"a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.888600 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.902483 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.924119 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.927327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.927367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.927380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.927399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.927412 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:43Z","lastTransitionTime":"2025-10-01T16:02:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.937538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.955768 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.970552 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:43 crc kubenswrapper[4764]: I1001 16:02:43.985470 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:43Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.005775 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.015952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.030493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.030538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.030553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.030526 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.030573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.030720 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.043174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.055159 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.069381 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.096439 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.118139 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.132100 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.132968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.132999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.133010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.133025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.133035 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.145875 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.157680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.171785 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.187425 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.200503 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.212259 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.225061 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.234945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.234975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.234984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.234997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.235008 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.243647 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.253977 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.264370 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.280960 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.297458 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.337641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.337669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.337676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.337692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.337700 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.440445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.440485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.440499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.440513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.440522 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.542962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.542999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.543008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.543021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.543031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.645288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.645330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.645340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.645356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.645366 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.748141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.748186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.748206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.748226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.748239 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.851013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.851039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.851064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.851081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.851091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.886079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.886354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.890082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.890946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.891091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.891221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} Oct 01 
16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.891283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.891362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.891817 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ff6bf25-73d6-4e89-b803-12502064e5f4" containerID="30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3" exitCode=0 Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.891904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerDied","Data":"30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.906974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.922348 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.934306 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.947854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.953260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.953408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.953436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.953450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.953461 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:44Z","lastTransitionTime":"2025-10-01T16:02:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.967816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.980916 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:44 crc kubenswrapper[4764]: I1001 16:02:44.993802 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:44Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.009255 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.020875 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.033990 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.055762 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.059401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.059447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.059459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.059478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.059492 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.069114 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.082501 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.096095 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.109211 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.122185 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.136187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.145568 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.156288 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.161578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.161618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.161627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.161644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.161653 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.167968 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.178630 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.187072 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.203363 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.213321 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.234816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.264696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc 
kubenswrapper[4764]: I1001 16:02:45.265058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.265069 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.265085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.265096 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.278068 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.295598 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.308804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.328423 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.328547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.328607 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:02:53.328579995 +0000 UTC m=+36.328226870 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.328677 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.328715 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.328767 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.328788 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 16:02:53.328753018 +0000 UTC m=+36.328399933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.328807 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:53.328798709 +0000 UTC m=+36.328445634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.367894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.367934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.367968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.367985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.368004 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.429958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.430018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430156 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430191 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430203 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430168 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430250 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:53.430236146 +0000 UTC m=+36.429882981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430257 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430269 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.430319 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:02:53.430292527 +0000 UTC m=+36.429939362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.445109 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vq8z5"] Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.445426 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.446853 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.447192 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.447310 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.447415 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.469681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.469713 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.469723 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.469736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.469745 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.473712 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.484183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.494569 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.512079 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.523136 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.531027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eff65edf-094d-4261-838e-7ae318e5c6fc-serviceca\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.531176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eff65edf-094d-4261-838e-7ae318e5c6fc-host\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.531220 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pfz9\" (UniqueName: \"kubernetes.io/projected/eff65edf-094d-4261-838e-7ae318e5c6fc-kube-api-access-6pfz9\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.545531 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.572115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.572151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.572158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.572172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.572182 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.584869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.627329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.631801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eff65edf-094d-4261-838e-7ae318e5c6fc-host\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.631835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pfz9\" (UniqueName: \"kubernetes.io/projected/eff65edf-094d-4261-838e-7ae318e5c6fc-kube-api-access-6pfz9\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.631869 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eff65edf-094d-4261-838e-7ae318e5c6fc-serviceca\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") 
" pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.632748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eff65edf-094d-4261-838e-7ae318e5c6fc-serviceca\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.632801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eff65edf-094d-4261-838e-7ae318e5c6fc-host\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.672823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pfz9\" (UniqueName: \"kubernetes.io/projected/eff65edf-094d-4261-838e-7ae318e5c6fc-kube-api-access-6pfz9\") pod \"node-ca-vq8z5\" (UID: \"eff65edf-094d-4261-838e-7ae318e5c6fc\") " pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.675184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.675220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.675228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.675244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.675254 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.684991 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.721417 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.721483 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.721439 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.721576 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.721685 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:45 crc kubenswrapper[4764]: E1001 16:02:45.721756 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.730974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.765423 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.778578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.778601 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.778611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.778625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.778635 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.806117 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.845704 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.853060 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vq8z5" Oct 01 16:02:45 crc kubenswrapper[4764]: W1001 16:02:45.869338 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff65edf_094d_4261_838e_7ae318e5c6fc.slice/crio-b32b2a07dc4a0ed74b94e96956b5e23926a46b715dd9f0ca206a5c749a5af95b WatchSource:0}: Error finding container b32b2a07dc4a0ed74b94e96956b5e23926a46b715dd9f0ca206a5c749a5af95b: Status 404 returned error can't find the container with id b32b2a07dc4a0ed74b94e96956b5e23926a46b715dd9f0ca206a5c749a5af95b Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.880763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.880802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.880812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.880827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.880837 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.887378 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.899236 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ff6bf25-73d6-4e89-b803-12502064e5f4" containerID="cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea" exitCode=0 Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.899289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerDied","Data":"cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea"} Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.900789 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vq8z5" event={"ID":"eff65edf-094d-4261-838e-7ae318e5c6fc","Type":"ContainerStarted","Data":"b32b2a07dc4a0ed74b94e96956b5e23926a46b715dd9f0ca206a5c749a5af95b"} Oct 01 16:02:45 crc 
kubenswrapper[4764]: I1001 16:02:45.924201 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.965858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.982738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.982776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.982785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.982799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:45 crc kubenswrapper[4764]: I1001 16:02:45.982809 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:45Z","lastTransitionTime":"2025-10-01T16:02:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.008283 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.054469 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.087286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.087318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.087327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.087342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.087352 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.088601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.124837 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.172056 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.189467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.189502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.189513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.189529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.189540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.205368 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.246631 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 
16:02:46.285115 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.291322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.291355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.291365 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.291382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.291392 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.327695 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.366319 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.393835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.393871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.393889 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.393906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.393917 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.406502 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf
4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.450479 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.488525 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.496107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.496146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.496160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.496174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.496183 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.525874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.598619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.598653 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.598661 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.598674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.598684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.701132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.701183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.701194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.701212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.701226 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.803814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.803854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.803864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.803880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.803890 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.906466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.906768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.906783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.906798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.906810 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:46Z","lastTransitionTime":"2025-10-01T16:02:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.909480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.917524 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ff6bf25-73d6-4e89-b803-12502064e5f4" containerID="f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f" exitCode=0 Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.917646 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerDied","Data":"f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.922573 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vq8z5" event={"ID":"eff65edf-094d-4261-838e-7ae318e5c6fc","Type":"ContainerStarted","Data":"422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506"} Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.931294 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.945576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.959632 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.971642 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:46 crc kubenswrapper[4764]: I1001 16:02:46.982437 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.001368 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:46Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.010498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.010538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.010549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.010565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.010577 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.013325 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.024786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.044624 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-c
opy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.056549 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.068542 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.078532 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.090042 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.102279 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.112239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.112269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.112277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 
16:02:47.112289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.112297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.127588 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770
eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.167547 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.207953 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.215444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.215718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.215727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.215739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.215747 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.246398 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.284398 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.322522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.322568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.322578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.322593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.322602 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.331273 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.371159 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.407203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.424848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.424889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.424903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.424920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.424930 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.448915 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.486510 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 
16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.526816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.527374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.527545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.527636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.527740 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.527976 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.571274 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://71079
93d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.610403 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.630837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.630876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.630888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.630909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.630924 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.651308 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.685671 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.722181 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:47 crc kubenswrapper[4764]: E1001 16:02:47.722300 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.722480 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.722994 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:47 crc kubenswrapper[4764]: E1001 16:02:47.722743 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:47 crc kubenswrapper[4764]: E1001 16:02:47.723149 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.726148 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.732800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.732833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.732843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 
16:02:47.732858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.732867 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.767125 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.805246 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.834730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.834765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.834776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.834871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.834884 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.847292 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.884552 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.928412 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ff6bf25-73d6-4e89-b803-12502064e5f4" containerID="32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325" exitCode=0 Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.928495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" 
event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerDied","Data":"32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.929144 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.937247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.937314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.937330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.937349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.937361 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:47Z","lastTransitionTime":"2025-10-01T16:02:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:47 crc kubenswrapper[4764]: I1001 16:02:47.965343 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.006222 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.039692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.039718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.039733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.039745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.039753 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.053309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.086262 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.125447 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.142001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.142064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.142077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.142097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.142108 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.163025 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.215858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.244118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.244155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.244165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.244180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.244191 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.247568 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.285674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.324882 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.346377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.346425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.346437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc 
kubenswrapper[4764]: I1001 16:02:48.346454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.346465 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.369917 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.406455 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.446315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.448726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.448757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.448766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.448779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.448788 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.490402 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.527334 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.551498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.551539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.551557 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.551572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.551583 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.568373 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.605597 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.649805 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.652996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.653026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.653034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.653068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.653081 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.690742 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.730191 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.757220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.757260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.757277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.757297 4764 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.757312 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.770949 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.816038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.845327 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.859281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.859327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.859340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.859357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.859369 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.883927 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.926417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.935303 4764 generic.go:334] "Generic (PLEG): container finished" podID="8ff6bf25-73d6-4e89-b803-12502064e5f4" containerID="1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8" exitCode=0 Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.935357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerDied","Data":"1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.961880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.961948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.961969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.961999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.962097 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:48Z","lastTransitionTime":"2025-10-01T16:02:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:48 crc kubenswrapper[4764]: I1001 16:02:48.975250 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:48Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.005553 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.055833 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.063804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.063844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.063856 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.063872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.063883 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.088681 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.135860 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.167978 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.171801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.171858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.171877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.171899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.171922 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.207928 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z 
is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.249505 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.275149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.275201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.275213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.275230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.275755 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.284960 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.328900 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.369339 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.377186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.378859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.378926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.378948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.378976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.378996 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.412611 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.447217 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.481644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.481667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.481675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 
16:02:49.481687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.481696 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.488213 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.527901 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.572390 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.584806 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.584836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.584845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.584859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.584868 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.609174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.647986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.686159 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.687363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.687392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.687400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.687413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.687422 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.721468 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:49 crc kubenswrapper[4764]: E1001 16:02:49.721578 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.722411 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:49 crc kubenswrapper[4764]: E1001 16:02:49.722492 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.722554 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:49 crc kubenswrapper[4764]: E1001 16:02:49.722619 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.729317 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.769315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.789430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.789474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.789483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 
16:02:49.789496 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.789504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.808960 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.847750 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.891911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.891977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.891990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc 
kubenswrapper[4764]: I1001 16:02:49.892007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.892018 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.892455 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 
16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.926272 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.941610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" 
event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.941962 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.941984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.946539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" event={"ID":"8ff6bf25-73d6-4e89-b803-12502064e5f4","Type":"ContainerStarted","Data":"d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425"} Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.970302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:49Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.972697 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.979022 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 
16:02:49.994535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.994562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.994571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.994585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:49 crc kubenswrapper[4764]: I1001 16:02:49.994594 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:49Z","lastTransitionTime":"2025-10-01T16:02:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.003079 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.054512 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.087358 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.097500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.097543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.097554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.097568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.097578 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.128936 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z 
is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.167445 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.200950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.201008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.201022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.201072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.201087 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.215114 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z 
is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.253783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.290414 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
1T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.303216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.303247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.303255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.303268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.303278 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.328315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51
f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.363395 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.405745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.405793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.405802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.405818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.405828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.417251 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.448496 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.489916 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.509130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.509196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.509219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.509248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.509307 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.528767 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.567849 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.607154 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.611877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.611910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.611923 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.611939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.611949 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.644076 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf
4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.685160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.714381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.714423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.714434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc 
kubenswrapper[4764]: I1001 16:02:50.714449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.714460 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.725239 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 
16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:50Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.817435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.817493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.817510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.817532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.817552 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.919737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.919778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.919789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.919806 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.919819 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:50Z","lastTransitionTime":"2025-10-01T16:02:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:50 crc kubenswrapper[4764]: I1001 16:02:50.950335 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.022104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.022137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.022145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.022160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.022169 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.124645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.124676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.124684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.124696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.124705 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.228073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.228141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.228154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.228174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.228187 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.330950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.330998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.331011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.331030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.331091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.433908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.434171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.434183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.434201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.434213 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.537429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.537481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.537497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.537525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.537541 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.640795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.640830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.640838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.640852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.640861 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.721648 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.721703 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:51 crc kubenswrapper[4764]: E1001 16:02:51.721760 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.721773 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:51 crc kubenswrapper[4764]: E1001 16:02:51.721996 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:51 crc kubenswrapper[4764]: E1001 16:02:51.721907 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.742648 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.742693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.742703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.742719 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.742729 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.845435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.845513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.845526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.845544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.845871 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.948381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.948447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.948459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.948479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.948499 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:51Z","lastTransitionTime":"2025-10-01T16:02:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:51 crc kubenswrapper[4764]: I1001 16:02:51.953328 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.051368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.051438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.051448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.051462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.051471 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.153978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.154252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.154338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.159223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.159497 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.262029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.262097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.262113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.262133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.262168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.364495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.364531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.364539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.364552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.364561 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.467531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.467580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.467594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.467614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.467627 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.570350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.570381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.570389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.570401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.570412 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.673278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.673321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.673337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.673364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.673375 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.776425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.776468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.776483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.776519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.776536 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.879091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.879130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.879140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.879155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.879167 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.957858 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/0.log" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.960600 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620" exitCode=1 Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.960644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.961393 4764 scope.go:117] "RemoveContainer" containerID="04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.975921 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:52Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.981962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.982004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.982017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:52 crc 
kubenswrapper[4764]: I1001 16:02:52.982035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.982068 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:52Z","lastTransitionTime":"2025-10-01T16:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:52 crc kubenswrapper[4764]: I1001 16:02:52.989064 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:52Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.007412 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.028443 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.044910 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.056499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.066909 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.080670 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.090958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.091009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.091019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.091034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.091067 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.095217 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.115709 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.136800 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:52Z\\\",\\\"message\\\":\\\" 6052 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306547 6052 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306813 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 16:02:52.306847 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 16:02:52.306852 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 16:02:52.306863 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 16:02:52.306887 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 16:02:52.306893 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 16:02:52.306903 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 16:02:52.306910 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 16:02:52.306909 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 16:02:52.306901 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 16:02:52.306942 6052 factory.go:656] Stopping watch factory\\\\nI1001 16:02:52.306947 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 16:02:52.306959 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1001 
16\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.156607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.169601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.182528 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.191448 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.192819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.192856 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.192881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.192903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.192916 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.295703 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.295759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.295776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.295796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.295808 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.397871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.397926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.397942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.397964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.397980 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.412335 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.412457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.412508 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:03:09.412481489 +0000 UTC m=+52.412128324 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.412547 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.412554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.412599 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:09.412583661 +0000 UTC m=+52.412230516 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.412689 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.412742 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:09.412731855 +0000 UTC m=+52.412378750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.500606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.500645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.500655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.500681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.500692 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.513312 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.513400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513496 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513526 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513537 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513549 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513572 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513585 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513595 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:09.513577309 +0000 UTC m=+52.513224234 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.513638 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:09.51362097 +0000 UTC m=+52.513267895 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.602940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.602980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.602989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.603003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.603011 4764 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.705577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.705625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.705638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.705656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.705668 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.706899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.706954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.706970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.706988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.707000 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.721550 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.721586 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.721612 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.721694 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.721814 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.721937 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.722322 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.725794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.725831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.725839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.725851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.725861 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.735978 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.739813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.739838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.739847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.739861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.739870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.786738 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.786901 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.807313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.807357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.807373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.807391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.807403 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.910039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.910132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.910148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.910168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.910181 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:53Z","lastTransitionTime":"2025-10-01T16:02:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.972440 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/1.log" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.976705 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/0.log" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.982555 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9" exitCode=1 Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.982607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9"} Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.982660 4764 scope.go:117] "RemoveContainer" containerID="04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620" Oct 01 16:02:53 crc kubenswrapper[4764]: I1001 16:02:53.986007 4764 scope.go:117] "RemoveContainer" containerID="0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9" Oct 01 16:02:53 crc kubenswrapper[4764]: E1001 16:02:53.987725 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.002352 4764 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.012573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.012634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.012656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.012687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.012709 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.017448 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.031832 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc
9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.045076 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.057883 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.068697 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.085761 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:52Z\\\",\\\"message\\\":\\\" 6052 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306547 6052 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306813 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 
16:02:52.306847 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 16:02:52.306852 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 16:02:52.306863 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 16:02:52.306887 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 16:02:52.306893 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 16:02:52.306903 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 16:02:52.306910 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 16:02:52.306909 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 16:02:52.306901 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 16:02:52.306942 6052 factory.go:656] Stopping watch factory\\\\nI1001 16:02:52.306947 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 16:02:52.306959 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 
16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.
d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.095818 4764 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.105190 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.114308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.114354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.114366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.114383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.114395 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.127846 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.140637 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.152135 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.163986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.177467 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.190526 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:54Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.217018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.217070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.217083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.217095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.217104 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.320033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.320107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.320125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.320149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.320168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.422935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.423001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.423013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.423030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.423043 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.525395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.525432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.525440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.525453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.525463 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.626997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.627092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.627111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.627131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.627146 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.729556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.729603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.729620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.729641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.729658 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.832763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.832804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.832814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.832828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.832837 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.936801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.936831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.936839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.936851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.936859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:54Z","lastTransitionTime":"2025-10-01T16:02:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:54 crc kubenswrapper[4764]: I1001 16:02:54.987510 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/1.log" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.038915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.038966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.038979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.038995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.039007 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.141235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.141302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.141318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.141342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.141357 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.244645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.244692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.244706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.244724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.244738 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.347601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.347641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.347650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.347670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.347679 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.451468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.451534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.451557 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.451586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.451606 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.473322 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g"] Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.475339 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.478249 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.478997 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.503309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:52Z\\\",\\\"message\\\":\\\" 6052 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306547 6052 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306813 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 
16:02:52.306847 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 16:02:52.306852 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 16:02:52.306863 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 16:02:52.306887 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 16:02:52.306893 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 16:02:52.306903 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 16:02:52.306910 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 16:02:52.306909 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 16:02:52.306901 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 16:02:52.306942 6052 factory.go:656] Stopping watch factory\\\\nI1001 16:02:52.306947 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 16:02:52.306959 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 
16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.
d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.520701 4764 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.541122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.554966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.555033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.555094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.555136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.555155 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.557786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z 
is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.616081 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10
-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.631479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvzwr\" (UniqueName: \"kubernetes.io/projected/321c50a8-5c97-4d27-9e2e-5ec64a57905a-kube-api-access-rvzwr\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.631543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/321c50a8-5c97-4d27-9e2e-5ec64a57905a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.631593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/321c50a8-5c97-4d27-9e2e-5ec64a57905a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.631656 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/321c50a8-5c97-4d27-9e2e-5ec64a57905a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.641466 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2a
b8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.658084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.658128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.658142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.658160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.658171 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.661571 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.672766 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.688544 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.701158 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.716231 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.720822 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.720871 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:55 crc kubenswrapper[4764]: E1001 16:02:55.720922 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:55 crc kubenswrapper[4764]: E1001 16:02:55.721090 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.721199 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:55 crc kubenswrapper[4764]: E1001 16:02:55.721427 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.729296 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.732007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/321c50a8-5c97-4d27-9e2e-5ec64a57905a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.732084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/321c50a8-5c97-4d27-9e2e-5ec64a57905a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.732137 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/321c50a8-5c97-4d27-9e2e-5ec64a57905a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.732163 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvzwr\" (UniqueName: \"kubernetes.io/projected/321c50a8-5c97-4d27-9e2e-5ec64a57905a-kube-api-access-rvzwr\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.732739 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/321c50a8-5c97-4d27-9e2e-5ec64a57905a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.732899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/321c50a8-5c97-4d27-9e2e-5ec64a57905a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.737342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/321c50a8-5c97-4d27-9e2e-5ec64a57905a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.748676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvzwr\" (UniqueName: \"kubernetes.io/projected/321c50a8-5c97-4d27-9e2e-5ec64a57905a-kube-api-access-rvzwr\") pod \"ovnkube-control-plane-749d76644c-dt47g\" (UID: \"321c50a8-5c97-4d27-9e2e-5ec64a57905a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.749090 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc
9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.760450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.760483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.760496 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.760513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.760524 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.763868 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a3
4aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.779555 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.792879 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.798078 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" Oct 01 16:02:55 crc kubenswrapper[4764]: W1001 16:02:55.808983 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod321c50a8_5c97_4d27_9e2e_5ec64a57905a.slice/crio-cdda25b676d0ce439b31f21beaf0a54dae7a9259b384b4f48da871df9a573843 WatchSource:0}: Error finding container cdda25b676d0ce439b31f21beaf0a54dae7a9259b384b4f48da871df9a573843: Status 404 returned error can't find the container with id cdda25b676d0ce439b31f21beaf0a54dae7a9259b384b4f48da871df9a573843 Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.863601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.863637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.863648 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.863663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.863673 4764 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.965229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.965419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.965431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.965445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.965456 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:55Z","lastTransitionTime":"2025-10-01T16:02:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:55 crc kubenswrapper[4764]: I1001 16:02:55.995914 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" event={"ID":"321c50a8-5c97-4d27-9e2e-5ec64a57905a","Type":"ContainerStarted","Data":"cdda25b676d0ce439b31f21beaf0a54dae7a9259b384b4f48da871df9a573843"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.068418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.068454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.068467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.068485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.068495 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.174209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.174243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.174251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.174265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.174275 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.276255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.276289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.276297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.276309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.276317 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.379198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.379488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.379651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.379787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.379923 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.483259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.483312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.483329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.483352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.483369 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.580743 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-btbfp"] Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.581393 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:56 crc kubenswrapper[4764]: E1001 16:02:56.581481 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.586156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.586419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.586543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.586727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.586858 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.598672 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.610287 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.625816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.637461 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc 
kubenswrapper[4764]: I1001 16:02:56.639576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qxb\" (UniqueName: \"kubernetes.io/projected/41a0358d-ae10-4282-9423-8f3599adbc2a-kube-api-access-r7qxb\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.639650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.652485 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc
9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.663607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.675481 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.686220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.689375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.689403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.689414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.689429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.689440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.706122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:52Z\\\",\\\"message\\\":\\\" 6052 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306547 6052 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306813 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 
16:02:52.306847 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 16:02:52.306852 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 16:02:52.306863 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 16:02:52.306887 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 16:02:52.306893 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 16:02:52.306903 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 16:02:52.306910 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 16:02:52.306909 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 16:02:52.306901 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 16:02:52.306942 6052 factory.go:656] Stopping watch factory\\\\nI1001 16:02:52.306947 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 16:02:52.306959 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 
16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.
d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.716654 4764 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.751013 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.751115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qxb\" (UniqueName: \"kubernetes.io/projected/41a0358d-ae10-4282-9423-8f3599adbc2a-kube-api-access-r7qxb\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:56 crc kubenswrapper[4764]: E1001 16:02:56.751175 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:02:56 crc kubenswrapper[4764]: E1001 16:02:56.751256 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:02:57.251235491 +0000 UTC m=+40.250882326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.752178 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.762803 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.767881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qxb\" (UniqueName: \"kubernetes.io/projected/41a0358d-ae10-4282-9423-8f3599adbc2a-kube-api-access-r7qxb\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.782824 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.792387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.792418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.792427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.792441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.792450 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.795096 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.810209 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-a
dditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df3
12ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.823595 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.837284 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:56Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.895693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.895744 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.895760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.895781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.895795 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.998965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.999004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.999016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.999033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:56 crc kubenswrapper[4764]: I1001 16:02:56.999084 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:56Z","lastTransitionTime":"2025-10-01T16:02:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.000785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" event={"ID":"321c50a8-5c97-4d27-9e2e-5ec64a57905a","Type":"ContainerStarted","Data":"c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.000824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" event={"ID":"321c50a8-5c97-4d27-9e2e-5ec64a57905a","Type":"ContainerStarted","Data":"efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.022526 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b
3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.037651 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.052382 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.067136 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.078476 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.089528 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.101309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.101351 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.101363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 
16:02:57.101382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.101394 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.103467 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce
76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.116654 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc
9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.129368 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.141893 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.151695 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.162057 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.171859 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc 
kubenswrapper[4764]: I1001 16:02:57.184840 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.195806 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.203096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.203135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.203147 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.203164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.203175 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.215224 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:52Z\\\",\\\"message\\\":\\\" 6052 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306547 6052 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306813 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 
16:02:52.306847 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 16:02:52.306852 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 16:02:52.306863 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 16:02:52.306887 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 16:02:52.306893 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 16:02:52.306903 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 16:02:52.306910 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 16:02:52.306909 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 16:02:52.306901 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 16:02:52.306942 6052 factory.go:656] Stopping watch factory\\\\nI1001 16:02:52.306947 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 16:02:52.306959 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 
16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.
d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.226262 4764 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.254806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " 
pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.254974 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.255058 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:02:58.25502515 +0000 UTC m=+41.254671995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.306364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.306421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.306433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.306450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.306461 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.409716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.409773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.409789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.409812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.409828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.512803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.512855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.512867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.512890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.512905 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.615932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.615984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.615996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.616013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.616027 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.718625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.718709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.718733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.718761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.718784 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.720822 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.720899 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.720965 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.721021 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.721100 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.721177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.721254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.721396 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.740505 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.754874 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.769903 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.786257 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.798682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.814183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.823328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.823362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.823585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 
16:02:57.823609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.823621 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.836599 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce
76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.846012 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.861216 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc 
kubenswrapper[4764]: I1001 16:02:57.877138 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.890852 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.905634 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.919505 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.927899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.927957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.927975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.927998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.928016 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:57Z","lastTransitionTime":"2025-10-01T16:02:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.941302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04ff53dc4fdee84e73e54308b71fd02e51d828caaedc644ca3ef3e03cd6b8620\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:52Z\\\",\\\"message\\\":\\\" 6052 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306547 6052 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 16:02:52.306813 6052 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 
16:02:52.306847 6052 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 16:02:52.306852 6052 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 16:02:52.306863 6052 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 16:02:52.306887 6052 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 16:02:52.306893 6052 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 16:02:52.306903 6052 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 16:02:52.306910 6052 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 16:02:52.306909 6052 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 16:02:52.306901 6052 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 16:02:52.306942 6052 factory.go:656] Stopping watch factory\\\\nI1001 16:02:52.306947 6052 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 16:02:52.306959 6052 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 
16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.
d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.952840 4764 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.966401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.979171 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.980125 4764 scope.go:117] "RemoveContainer" containerID="0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9" Oct 01 16:02:57 crc kubenswrapper[4764]: E1001 16:02:57.980593 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:02:57 crc kubenswrapper[4764]: I1001 16:02:57.986229 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.000473 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T16:02:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.014830 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"n
ame\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.031330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.031378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.031390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.031406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.031417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.032065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.043731 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.056538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.065471 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.089203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.099765 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.111614 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.122563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.133948 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.134077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.134111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.134123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.134140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.134152 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.146022 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.155635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.167947 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.178965 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc 
kubenswrapper[4764]: I1001 16:02:58.191409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.203791 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:58Z is after 2025-08-24T17:21:41Z" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.236734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.236767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.236775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.236787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.236795 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.265803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:02:58 crc kubenswrapper[4764]: E1001 16:02:58.265921 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:02:58 crc kubenswrapper[4764]: E1001 16:02:58.265967 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:03:00.265954964 +0000 UTC m=+43.265601799 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.339648 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.339685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.339694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.339710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.339719 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.441792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.441839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.441854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.441874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.441895 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.544563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.544612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.544624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.544641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.544653 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.647914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.647974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.647996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.648025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.648087 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.750574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.750634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.750647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.750664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.750676 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.853194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.853232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.853243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.853259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.853270 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.955864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.955907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.955922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.955942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:58 crc kubenswrapper[4764]: I1001 16:02:58.955958 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:58Z","lastTransitionTime":"2025-10-01T16:02:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.058654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.058751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.058768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.058782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.058798 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.161615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.161669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.161685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.161708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.161724 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.264914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.264946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.264954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.264966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.264974 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.367509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.367585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.367607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.367634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.367655 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.469772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.469841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.469866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.469894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.469916 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.574774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.574851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.574868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.574895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.574912 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.678565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.678644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.678668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.678698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.678720 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.721461 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.721541 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.721569 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 16:02:59 crc kubenswrapper[4764]: E1001 16:02:59.724383 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.725316 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 16:02:59 crc kubenswrapper[4764]: E1001 16:02:59.725550 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 16:02:59 crc kubenswrapper[4764]: E1001 16:02:59.725816 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 16:02:59 crc kubenswrapper[4764]: E1001 16:02:59.726025 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.782251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.782342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.782367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.782397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.782422 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.886441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.886478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.886489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.886505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.886517 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.989709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.989784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.989809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.989840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:02:59 crc kubenswrapper[4764]: I1001 16:02:59.989864 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:02:59Z","lastTransitionTime":"2025-10-01T16:02:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.093124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.093186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.093205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.093228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.093245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.196334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.196402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.196426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.196456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.196477 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.285367 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp"
Oct 01 16:03:00 crc kubenswrapper[4764]: E1001 16:03:00.285586 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 16:03:00 crc kubenswrapper[4764]: E1001 16:03:00.285696 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:03:04.285670484 +0000 UTC m=+47.285317359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.299018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.299139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.299173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.299207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.299233 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.401982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.402114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.402142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.402172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.402193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.504882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.504942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.504965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.504994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.505018 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.607911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.607986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.608013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.608041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.608128 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.711277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.711412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.711444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.711476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.711500 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.815765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.815832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.815863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.815895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.815920 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.918380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.918525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.918558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.918578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:00 crc kubenswrapper[4764]: I1001 16:03:00.918590 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:00Z","lastTransitionTime":"2025-10-01T16:03:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.021019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.021105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.021128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.021151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.021168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.123756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.123809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.123821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.123840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.123855 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.226138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.226182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.226191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.226205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.226214 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.328871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.328909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.328918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.328932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.328942 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.431885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.431943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.431958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.431986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.432008 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.534838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.534886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.534895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.534912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.534922 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.638302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.638367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.638382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.638404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.638422 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.721324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp"
Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.721383 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 16:03:01 crc kubenswrapper[4764]: E1001 16:03:01.721581 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.721623 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.721603 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:01 crc kubenswrapper[4764]: E1001 16:03:01.721831 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:01 crc kubenswrapper[4764]: E1001 16:03:01.722209 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:01 crc kubenswrapper[4764]: E1001 16:03:01.722307 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.741269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.741349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.741366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.741393 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.741409 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.843727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.843759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.843767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.843780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.843789 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.946753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.946799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.946822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.946839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:01 crc kubenswrapper[4764]: I1001 16:03:01.946847 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:01Z","lastTransitionTime":"2025-10-01T16:03:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.049153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.049446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.049560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.049695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.049935 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.153025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.153561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.153651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.153738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.153817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.256388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.256616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.256683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.256748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.256802 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.360000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.360123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.360190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.360220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.360235 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.463082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.463111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.463119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.463147 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.463155 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.566868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.566926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.566939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.566962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.566976 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.669852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.669884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.669893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.669907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.669916 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.772317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.772360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.772369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.772383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.772391 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.874425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.874688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.874880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.875072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.875237 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.978727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.978792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.978810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.978841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:02 crc kubenswrapper[4764]: I1001 16:03:02.978861 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:02Z","lastTransitionTime":"2025-10-01T16:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.082207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.082524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.082744 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.082931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.083134 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.186141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.186738 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.186968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.187103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.187338 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.289597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.289840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.289922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.290283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.290361 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.397794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.397842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.397851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.397864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.397874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.500626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.500666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.500706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.500720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.500731 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.604647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.605277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.605295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.605324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.605339 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.708450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.708519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.708541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.708571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.708589 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.721653 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.721809 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.721835 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:03 crc kubenswrapper[4764]: E1001 16:03:03.721988 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.722328 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:03 crc kubenswrapper[4764]: E1001 16:03:03.722446 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:03 crc kubenswrapper[4764]: E1001 16:03:03.722491 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:03 crc kubenswrapper[4764]: E1001 16:03:03.722644 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.810781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.810817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.810826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.810839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.810848 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.913247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.913285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.913297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.913313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.913324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.976952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.977003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.977019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.977039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.977076 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:03 crc kubenswrapper[4764]: E1001 16:03:03.989155 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:03Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.992825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.992882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.992896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.992925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:03 crc kubenswrapper[4764]: I1001 16:03:03.992941 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:03Z","lastTransitionTime":"2025-10-01T16:03:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.008800 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:04Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.013003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.013062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.013074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.013088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.013097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.030359 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:04Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.034440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.034497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.034512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.034530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.034588 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.050102 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:04Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.054455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.054499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.054510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.054534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.054556 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.069114 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:04Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.069266 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.071014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.071095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.071112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.071134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.071149 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.174455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.174493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.174502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.174517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.174526 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.276946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.276984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.276996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.277015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.277040 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.325905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.326027 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:03:04 crc kubenswrapper[4764]: E1001 16:03:04.326124 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:03:12.326106885 +0000 UTC m=+55.325753720 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.379344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.379451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.379472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.379493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.379513 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.482507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.482547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.482558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.482570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.482579 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.585440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.585486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.585527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.585549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.585565 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.688230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.688276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.688295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.688316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.688331 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.791111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.791149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.791158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.791172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.791180 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.893062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.893117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.893136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.893156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.893169 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.996100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.996164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.996178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.996205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:04 crc kubenswrapper[4764]: I1001 16:03:04.996223 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:04Z","lastTransitionTime":"2025-10-01T16:03:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.100316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.100394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.100404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.100421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.100436 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.203792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.203859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.203871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.203889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.203902 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.306445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.306491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.306501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.306515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.306525 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.409189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.409252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.409270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.409291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.409302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.511340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.511432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.511448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.511490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.511501 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.613722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.613797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.613824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.613847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.613860 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.716252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.716290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.716302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.716317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.716328 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.721732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:05 crc kubenswrapper[4764]: E1001 16:03:05.721841 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.722293 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.722396 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:05 crc kubenswrapper[4764]: E1001 16:03:05.722399 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.722454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:05 crc kubenswrapper[4764]: E1001 16:03:05.722478 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:05 crc kubenswrapper[4764]: E1001 16:03:05.722579 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.823826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.823877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.823890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.823907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.823919 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.926418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.926483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.926506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.926536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:05 crc kubenswrapper[4764]: I1001 16:03:05.926560 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:05Z","lastTransitionTime":"2025-10-01T16:03:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.029304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.029350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.029364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.029380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.029390 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.131529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.131570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.131581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.131597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.131608 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.233843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.233903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.233917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.233938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.233953 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.336321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.336362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.336370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.336387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.336398 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.438617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.438894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.438966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.439037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.439163 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.542089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.542138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.542151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.542170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.542183 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.645206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.645245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.645270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.645292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.645306 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.748412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.748463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.748474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.748493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.748508 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.851286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.851343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.851357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.851374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.851389 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.954690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.954724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.954735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.954750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:06 crc kubenswrapper[4764]: I1001 16:03:06.954761 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:06Z","lastTransitionTime":"2025-10-01T16:03:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.057529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.057565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.057576 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.057591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.057602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.159611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.159647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.159659 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.159673 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.159684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.261297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.261327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.261336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.261349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.261357 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.363766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.363800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.363808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.363826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.363840 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.466356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.466389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.466397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.466411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.466421 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.568614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.568647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.568656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.568670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.568678 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.673627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.673711 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.673730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.673758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.673777 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.721704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.721775 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.721893 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:07 crc kubenswrapper[4764]: E1001 16:03:07.721990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.722041 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:07 crc kubenswrapper[4764]: E1001 16:03:07.722206 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:07 crc kubenswrapper[4764]: E1001 16:03:07.722525 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:07 crc kubenswrapper[4764]: E1001 16:03:07.722549 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.733567 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.746253 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\"
:\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.759612 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.775733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.776026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.776131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.776211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.776272 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.779122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.800777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.816282 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.834086 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.844755 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.857224 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.870999 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.880213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.880277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.880293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 
16:03:07.880318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.880333 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.885574 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce
76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.898590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc 
kubenswrapper[4764]: I1001 16:03:07.910663 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.925171 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.940744 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.955778 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.968564 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.982101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.982135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.982143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.982156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:07 crc kubenswrapper[4764]: I1001 16:03:07.982164 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:07Z","lastTransitionTime":"2025-10-01T16:03:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.084415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.084462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.084475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.084492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.084504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.187585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.187640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.187660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.187685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.187702 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.289981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.290098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.290148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.290173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.290189 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.392229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.392298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.392311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.392333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.392347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.495843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.495920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.495940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.495971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.495989 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.598779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.598822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.598830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.598853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.598872 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.702691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.702737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.702749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.702767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.702782 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.805242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.805291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.805305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.805321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.805332 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.907978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.908020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.908031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.908070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:08 crc kubenswrapper[4764]: I1001 16:03:08.908085 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:08Z","lastTransitionTime":"2025-10-01T16:03:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.010601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.010637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.010645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.010657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.010666 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.112955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.112988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.113001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.113018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.113031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.216270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.216871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.217134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.217362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.217529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.320382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.320423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.320440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.320454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.320463 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.423674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.423726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.423743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.423765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.423781 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.478415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.478526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.478586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.478724 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.478787 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:41.478771133 +0000 UTC m=+84.478417968 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.478797 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.478834 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:41.478824725 +0000 UTC m=+84.478471550 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.478945 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:03:41.478932447 +0000 UTC m=+84.478579282 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.526326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.526368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.526380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.526395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.526406 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.579396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.579670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.579871 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.579960 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.580018 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.580145 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:41.580127779 +0000 UTC m=+84.579774614 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.580223 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.580366 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.580444 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.580553 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:03:41.580541629 +0000 UTC m=+84.580188464 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.628635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.628890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.629001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.629121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.629228 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.720825 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.720978 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.721087 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.721230 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.721087 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.721326 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.721330 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:09 crc kubenswrapper[4764]: E1001 16:03:09.721426 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.730938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.730967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.730978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.730992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.731003 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.796038 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.806665 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.837369 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264f
e972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.842549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.842576 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.842587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.842601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.842613 4764 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.864437 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.879355 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z 
is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.889197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.904069 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.917002 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.931924 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.941827 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: 
I1001 16:03:09.944443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.944547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.944559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.944574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.944585 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:09Z","lastTransitionTime":"2025-10-01T16:03:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.951514 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc 
kubenswrapper[4764]: I1001 16:03:09.964411 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.974946 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.985370 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:09 crc kubenswrapper[4764]: I1001 16:03:09.993716 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:09Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.009850 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:10Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.020632 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:10Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.032499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:10Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.044662 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:10Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.046096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.046123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.046133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.046147 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.046158 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.150961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.150997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.151009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.151023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.151034 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.253992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.254067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.254077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.254097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.254108 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.356801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.356844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.356857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.356872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.356884 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.459110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.459152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.459164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.459180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.459192 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.561905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.561970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.561990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.562016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.562034 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.664935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.664968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.664978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.664993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.665006 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.722155 4764 scope.go:117] "RemoveContainer" containerID="0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.768027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.768121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.768146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.768173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.768192 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.870409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.870434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.870442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.870457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.870465 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.972433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.972464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.972474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.972519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:10 crc kubenswrapper[4764]: I1001 16:03:10.972529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:10Z","lastTransitionTime":"2025-10-01T16:03:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.045701 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/1.log" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.047631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.048082 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.063094 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.074892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.074940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.074951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.074968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.074979 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.074985 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.086453 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.110753 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.129395 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.145776 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.159908 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.177183 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.177946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.177987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.177999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.178016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.178036 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.191008 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.206938 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.220174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.230189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc 
kubenswrapper[4764]: I1001 16:03:11.242574 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.252546 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.261754 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.271630 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.280393 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.280426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.280435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.280448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.280457 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.288491 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 
2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.298013 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:11Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.382982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.383024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.383032 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.383066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.383078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.489603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.489647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.489657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.489672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.489680 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.591459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.591495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.591504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.591551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.591561 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.694226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.694265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.694274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.694286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.694294 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.721090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:11 crc kubenswrapper[4764]: E1001 16:03:11.721214 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.721244 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.721278 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.721090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:11 crc kubenswrapper[4764]: E1001 16:03:11.721332 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:11 crc kubenswrapper[4764]: E1001 16:03:11.721389 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:11 crc kubenswrapper[4764]: E1001 16:03:11.721466 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.796857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.796916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.796937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.796967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.796989 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.899669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.899740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.899752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.899767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:11 crc kubenswrapper[4764]: I1001 16:03:11.899777 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:11Z","lastTransitionTime":"2025-10-01T16:03:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.001946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.001980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.001989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.002002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.002011 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.052446 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/2.log" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.053280 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/1.log" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.055367 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea" exitCode=1 Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.055401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.055434 4764 scope.go:117] "RemoveContainer" containerID="0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.056606 4764 scope.go:117] "RemoveContainer" containerID="fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea" Oct 01 16:03:12 crc kubenswrapper[4764]: E1001 16:03:12.056937 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.075316 4764 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.096941 4764 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b4d7a4a3f1d43fa7c7285a23f59629ec52fc3d86241639cfae2924aa77115d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:02:53Z\\\",\\\"message\\\":\\\"ipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, 
Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1001 16:02:53.815992 6205 services_controller.go:452] Built service openshift-controller-manager-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1001 16:02:53.816002 6205 services_controller.go:453] Built service openshift-controller-manager-operator/metrics template LB for network=default: []services.LB{}\\\\nF1001 16:02:53.816007 6205 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:02:53Z is after 2025-08-24\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:53Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 16:03:11.583374 6425 address_set.go:302] 
New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de
9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.105165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.105210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.105222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.105238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.105248 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.109292 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 
16:03:12.120478 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.131635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.142869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.161291 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.175284 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.190193 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.204040 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.212404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.212447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.212460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.212477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.212488 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.216894 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.234771 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.246055 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.262123 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.275104 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc 
kubenswrapper[4764]: I1001 16:03:12.291341 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.303207 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.314398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.314449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.314462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.314481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.314494 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.317442 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:12Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.405122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:12 crc kubenswrapper[4764]: E1001 16:03:12.405244 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:03:12 crc kubenswrapper[4764]: E1001 16:03:12.405318 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. 
No retries permitted until 2025-10-01 16:03:28.405299408 +0000 UTC m=+71.404946243 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.417065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.417111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.417125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.417140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.417152 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.519102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.519175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.519186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.519201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.519211 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.622131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.622169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.622179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.622197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.622216 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.724875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.724946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.724970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.725000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.725025 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.828132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.828449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.828539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.828631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.828721 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.931432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.931495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.931506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.931521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:12 crc kubenswrapper[4764]: I1001 16:03:12.931530 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:12Z","lastTransitionTime":"2025-10-01T16:03:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.033567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.033609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.033620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.033635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.033645 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.059609 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/2.log" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.062583 4764 scope.go:117] "RemoveContainer" containerID="fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea" Oct 01 16:03:13 crc kubenswrapper[4764]: E1001 16:03:13.062718 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.073533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.094162 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.110556 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.126154 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.136068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.136106 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.136117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.136132 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.136143 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.140450 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.153826 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.168346 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.180685 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.193272 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b8085182
75e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.203955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc 
kubenswrapper[4764]: I1001 16:03:13.215402 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.227259 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.238190 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.238244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.238284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.238294 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.238312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.238324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.251552 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53
f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.267646 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI1001 16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.277811 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.287405 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.297526 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:13Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.340355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.340387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.340396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.340412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.340421 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.442936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.442990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.443004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.443025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.443063 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.545946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.545993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.546010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.546028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.546058 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.648757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.648810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.648821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.648840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.648852 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.721251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.721251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.721379 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:13 crc kubenswrapper[4764]: E1001 16:03:13.721793 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:13 crc kubenswrapper[4764]: E1001 16:03:13.721661 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:13 crc kubenswrapper[4764]: E1001 16:03:13.721854 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.721378 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:13 crc kubenswrapper[4764]: E1001 16:03:13.721929 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.753342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.753390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.753401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.753418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.753433 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.855729 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.856084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.856226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.856343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.856429 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.959476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.959567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.959582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.959606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:13 crc kubenswrapper[4764]: I1001 16:03:13.959618 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:13Z","lastTransitionTime":"2025-10-01T16:03:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.062763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.063633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.063726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.063858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.063971 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.167607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.167647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.167657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.167672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.167685 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.271283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.271330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.271344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.271363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.271377 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.272699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.272733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.272742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.272754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.272764 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: E1001 16:03:14.288114 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:14Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.292881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.292927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.292939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.292958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.292972 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: E1001 16:03:14.306697 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:14Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.311334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.311368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.311377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.311392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.311404 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: E1001 16:03:14.323981 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:14Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.328276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.328301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.328310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.328323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.328332 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: E1001 16:03:14.340201 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:14Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.343190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.343225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.343236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.343250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.343262 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: E1001 16:03:14.353955 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:14Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:14 crc kubenswrapper[4764]: E1001 16:03:14.354160 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.373627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.373682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.373698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.373721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.373737 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.476949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.477022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.477081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.477116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.477138 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.579579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.579632 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.579650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.579678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.579701 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.682057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.682095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.682105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.682121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.682131 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.785763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.785844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.785856 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.785875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.785888 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.889498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.889569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.889588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.889613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.889734 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.992405 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.992447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.992457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.992472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:14 crc kubenswrapper[4764]: I1001 16:03:14.992482 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:14Z","lastTransitionTime":"2025-10-01T16:03:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.094513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.094560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.094573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.094592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.094605 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.197425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.197737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.197819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.197907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.197987 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.300418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.300453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.300464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.300479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.300492 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.403407 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.403452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.403464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.403481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.403492 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.506394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.506432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.506441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.506456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.506465 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.609877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.609914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.609924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.609940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.609952 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.711911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.712001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.712017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.712036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.712075 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.721343 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:15 crc kubenswrapper[4764]: E1001 16:03:15.721471 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.721484 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.721524 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.721523 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:15 crc kubenswrapper[4764]: E1001 16:03:15.721605 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:15 crc kubenswrapper[4764]: E1001 16:03:15.721715 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:15 crc kubenswrapper[4764]: E1001 16:03:15.721766 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.814853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.815271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.815590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.815728 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.815844 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.918766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.918810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.918821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.918838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:15 crc kubenswrapper[4764]: I1001 16:03:15.918850 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:15Z","lastTransitionTime":"2025-10-01T16:03:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.021573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.021609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.021636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.021651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.021660 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.123432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.123461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.123469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.123482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.123491 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.226890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.227182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.227257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.227341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.227417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.330117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.330177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.330191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.330216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.330232 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.432487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.432548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.432563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.432581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.432593 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.535234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.535497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.535665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.535899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.535983 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.638913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.638957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.638965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.638983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.638993 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.741017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.741304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.741396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.741486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.741568 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.844266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.844316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.844332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.844354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.844370 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.947593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.947640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.947657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.947677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:16 crc kubenswrapper[4764]: I1001 16:03:16.947689 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:16Z","lastTransitionTime":"2025-10-01T16:03:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.051078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.051141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.051158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.051182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.051199 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.153761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.154120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.154243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.154363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.154664 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.256826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.256864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.256874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.256888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.256901 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.359819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.360143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.360219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.360282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.360369 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.462727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.463023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.463164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.463266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.463358 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.566791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.566854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.566865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.566883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.566897 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.670290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.670900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.670998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.671118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.671209 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.720771 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.720794 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.720818 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:17 crc kubenswrapper[4764]: E1001 16:03:17.720904 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.720955 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:17 crc kubenswrapper[4764]: E1001 16:03:17.721007 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:17 crc kubenswrapper[4764]: E1001 16:03:17.721090 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:17 crc kubenswrapper[4764]: E1001 16:03:17.721155 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.745694 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.761226 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.774571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.774623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.774633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.774652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.774666 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.780854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.796580 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.809658 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.824814 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.842748 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.855178 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.867080 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc 
kubenswrapper[4764]: I1001 16:03:17.878686 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.878733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.878745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.878762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.878774 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.882592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3
de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.894690 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.911215 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.928816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.940152 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.958571 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.973998 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.981755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.981795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.981810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.981833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.981848 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:17Z","lastTransitionTime":"2025-10-01T16:03:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.986221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:17 crc kubenswrapper[4764]: I1001 16:03:17.994687 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:17Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.083623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.083661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.083669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.083680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.083688 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.188482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.188559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.188584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.188618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.188642 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.291133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.291170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.291181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.291196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.291208 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.393515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.393550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.393559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.393571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.393580 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.496365 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.496445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.496469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.496497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.496515 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.598792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.598845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.598864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.598887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.598902 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.701322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.701366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.701377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.701398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.701412 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.804455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.804498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.804509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.804534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.804545 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.907121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.907167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.907181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.907202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:18 crc kubenswrapper[4764]: I1001 16:03:18.907217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:18Z","lastTransitionTime":"2025-10-01T16:03:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.010245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.010297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.010314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.010334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.010352 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.112753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.112805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.112820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.112841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.112857 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.215401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.215455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.215470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.215490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.215505 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.402204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.402241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.402250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.402266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.402276 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.504713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.504753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.504762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.504775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.504785 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.607901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.607958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.607969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.607989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.608000 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.709889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.709922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.709934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.709957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.709967 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.721349 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.721420 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:19 crc kubenswrapper[4764]: E1001 16:03:19.721469 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:19 crc kubenswrapper[4764]: E1001 16:03:19.721537 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.721596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:19 crc kubenswrapper[4764]: E1001 16:03:19.721636 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.721685 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:19 crc kubenswrapper[4764]: E1001 16:03:19.721731 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.811952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.811991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.812004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.812020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.812032 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.914613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.914704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.914735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.914763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:19 crc kubenswrapper[4764]: I1001 16:03:19.914784 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:19Z","lastTransitionTime":"2025-10-01T16:03:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.016751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.016792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.016822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.016842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.016852 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.118812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.118866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.118879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.118897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.118914 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.221246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.221309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.221321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.221337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.221349 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.323608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.323658 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.323671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.323687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.323697 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.425803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.425874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.425899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.425926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.425955 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.528485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.528538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.528549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.528565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.528576 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.630449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.630491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.630500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.630513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.630522 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.732721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.732768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.732778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.732793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.732802 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.835267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.835397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.835412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.835430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.835439 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.938982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.939126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.939162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.939194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:20 crc kubenswrapper[4764]: I1001 16:03:20.939221 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:20Z","lastTransitionTime":"2025-10-01T16:03:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.041656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.041741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.041753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.041770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.041782 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.145239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.145317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.145342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.145367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.145384 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.247474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.247513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.247529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.247544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.247555 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.350741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.350796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.350810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.350830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.350844 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.453584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.453620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.453638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.453654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.453665 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.556535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.556605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.556627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.556652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.556668 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.659204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.659244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.659255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.659270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.659281 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.724689 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:21 crc kubenswrapper[4764]: E1001 16:03:21.724854 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.724949 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:21 crc kubenswrapper[4764]: E1001 16:03:21.725029 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.725109 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:21 crc kubenswrapper[4764]: E1001 16:03:21.725189 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.725317 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:21 crc kubenswrapper[4764]: E1001 16:03:21.725433 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.761727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.761776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.761788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.761806 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.761817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.864763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.864800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.864809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.864822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.864832 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.966993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.967030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.967063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.967080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:21 crc kubenswrapper[4764]: I1001 16:03:21.967093 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:21Z","lastTransitionTime":"2025-10-01T16:03:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.069067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.069112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.069122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.069134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.069142 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.172191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.172277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.172287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.172305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.172316 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.274909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.274949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.274958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.275061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.275086 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.377852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.377899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.377913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.377931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.377943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.481317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.481547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.481564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.481583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.481595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.584848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.584892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.584903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.584916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.584925 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.686680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.686709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.686717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.686729 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.686737 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.789404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.789449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.789460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.789479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.789489 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.891870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.891910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.891925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.891941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.891951 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.994360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.994397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.994408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.994422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:22 crc kubenswrapper[4764]: I1001 16:03:22.994433 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:22Z","lastTransitionTime":"2025-10-01T16:03:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.097485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.097565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.097582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.097606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.097618 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.199849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.199889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.199897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.199913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.199921 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.302429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.302471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.302484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.302505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.302518 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.404988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.405056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.405075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.405089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.405098 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.507236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.507272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.507282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.507295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.507304 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.609979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.610026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.610067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.610088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.610103 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.713279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.713320 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.713334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.713350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.713361 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.720724 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.720810 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:23 crc kubenswrapper[4764]: E1001 16:03:23.720857 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.720923 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.721009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:23 crc kubenswrapper[4764]: E1001 16:03:23.720968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:23 crc kubenswrapper[4764]: E1001 16:03:23.721182 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:23 crc kubenswrapper[4764]: E1001 16:03:23.721270 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.817819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.817874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.817887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.817905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.817920 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.921373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.921433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.921451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.921473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:23 crc kubenswrapper[4764]: I1001 16:03:23.921491 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:23Z","lastTransitionTime":"2025-10-01T16:03:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.024146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.024205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.024216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.024233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.024244 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.126756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.126805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.126816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.126833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.126847 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.229014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.229075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.229088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.229104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.229124 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.331007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.331060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.331071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.331087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.331096 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.433737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.433795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.433805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.433820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.433829 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.536038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.536087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.536098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.536116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.536146 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.588947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.588998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.589008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.589021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.589031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: E1001 16:03:24.601674 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:24Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.604919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.604955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.604966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.605008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.605021 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: E1001 16:03:24.618272 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:24Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.622867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.622904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.622917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.622934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.622954 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: E1001 16:03:24.636623 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:24Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.640551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.640584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.640592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.640606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.640615 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: E1001 16:03:24.654981 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:24Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.658443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.658511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.658528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.658552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.658568 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: E1001 16:03:24.671550 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:24Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:24 crc kubenswrapper[4764]: E1001 16:03:24.671707 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.673378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.673417 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.673428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.673443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.673453 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.776696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.776741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.776754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.776770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.776783 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.879152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.879195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.879209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.879228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.879242 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.981015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.981066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.981085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.981110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:24 crc kubenswrapper[4764]: I1001 16:03:24.981122 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:24Z","lastTransitionTime":"2025-10-01T16:03:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.083562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.083598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.083606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.083619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.083627 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.186632 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.186666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.186677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.186694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.186704 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.289308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.289353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.289370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.289390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.289406 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.392249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.392309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.392319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.392337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.392347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.495735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.495797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.495809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.495826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.495837 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.599698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.599758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.599775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.599797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.599812 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.703636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.703686 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.703699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.703715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.703727 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.721238 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:25 crc kubenswrapper[4764]: E1001 16:03:25.721364 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.721536 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:25 crc kubenswrapper[4764]: E1001 16:03:25.721637 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.721749 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:25 crc kubenswrapper[4764]: E1001 16:03:25.721798 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.721927 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:25 crc kubenswrapper[4764]: E1001 16:03:25.721984 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.806357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.806395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.806408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.806423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.806434 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.909072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.909106 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.909115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.909129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:25 crc kubenswrapper[4764]: I1001 16:03:25.909140 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:25Z","lastTransitionTime":"2025-10-01T16:03:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.012004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.012040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.012064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.012078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.012087 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.114930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.114970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.114982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.115000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.115011 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.217864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.217922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.217940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.217960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.217972 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.320563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.320647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.320670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.320705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.320729 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.422642 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.422678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.422691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.422708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.422719 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.524817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.524853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.524863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.524879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.524889 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.628007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.628073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.628083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.628096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.628104 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.722356 4764 scope.go:117] "RemoveContainer" containerID="fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea" Oct 01 16:03:26 crc kubenswrapper[4764]: E1001 16:03:26.722509 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.730251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.730285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.730296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.730311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.730324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.833505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.833542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.833578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.833597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.833608 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.936504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.936550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.936563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.936583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:26 crc kubenswrapper[4764]: I1001 16:03:26.936595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:26Z","lastTransitionTime":"2025-10-01T16:03:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.038917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.038949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.038957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.038970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.038978 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.141286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.141332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.141346 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.141363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.141376 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.243633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.243699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.243802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.243829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.243842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.346500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.346534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.346543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.346556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.346566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.449000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.449063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.449075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.449092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.449111 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.551353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.551398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.551409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.551428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.551440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.654143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.654191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.654202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.654217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.654228 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.721748 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.721791 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.721837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.721893 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:27 crc kubenswrapper[4764]: E1001 16:03:27.722008 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:27 crc kubenswrapper[4764]: E1001 16:03:27.722112 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:27 crc kubenswrapper[4764]: E1001 16:03:27.722155 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:27 crc kubenswrapper[4764]: E1001 16:03:27.722192 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.735025 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.745171 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.757125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.757163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.757174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 
16:03:27.757195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.757211 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.757441 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce
76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.766487 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc 
kubenswrapper[4764]: I1001 16:03:27.778037 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.788531 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.802148 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.811711 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.824937 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.837893 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.848117 4764 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.858688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.858710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.858717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.858730 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.858738 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.859577 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.870115 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.886705 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.904156 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.916510 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.927917 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.938734 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:27Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.961260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.961295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.961305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.961318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:27 crc kubenswrapper[4764]: I1001 16:03:27.961329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:27Z","lastTransitionTime":"2025-10-01T16:03:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.063780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.063823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.063835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.063850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.063861 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.165879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.166171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.166284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.166350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.166414 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.269114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.269141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.269153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.269165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.269174 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.371733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.371793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.371805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.371829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.371843 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.474083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.474108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.474117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.474131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.474139 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.490650 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:28 crc kubenswrapper[4764]: E1001 16:03:28.490781 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:03:28 crc kubenswrapper[4764]: E1001 16:03:28.490821 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:04:00.490807278 +0000 UTC m=+103.490454113 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.576186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.576261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.576297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.576327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.576352 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.679013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.679124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.679138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.679156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.679168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.782165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.782210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.782224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.782240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.782251 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.884497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.884539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.884549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.884567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.884581 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.986710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.986748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.986759 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.986774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:28 crc kubenswrapper[4764]: I1001 16:03:28.986813 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:28Z","lastTransitionTime":"2025-10-01T16:03:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.088891 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.088926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.088938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.088954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.088965 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.192020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.192076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.192088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.192104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.192114 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.294352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.294403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.294440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.294462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.294472 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.397317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.397358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.397371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.397387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.397399 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.499458 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.499491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.499500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.499514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.499523 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.602039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.602095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.602105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.602120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.602131 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.703893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.703947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.703960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.703977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.703990 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.721678 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.721712 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.721736 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.721717 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:29 crc kubenswrapper[4764]: E1001 16:03:29.721797 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:29 crc kubenswrapper[4764]: E1001 16:03:29.721958 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:29 crc kubenswrapper[4764]: E1001 16:03:29.722113 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:29 crc kubenswrapper[4764]: E1001 16:03:29.722220 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.806083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.806120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.806129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.806143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.806152 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.908453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.908540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.908562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.908591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:29 crc kubenswrapper[4764]: I1001 16:03:29.908612 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:29Z","lastTransitionTime":"2025-10-01T16:03:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.012123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.012165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.012176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.012192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.012205 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.110236 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/0.log" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.110290 4764 generic.go:334] "Generic (PLEG): container finished" podID="5499b593-79e4-408e-a32b-9e132d3a0de7" containerID="c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2" exitCode=1 Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.110322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerDied","Data":"c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.110723 4764 scope.go:117] "RemoveContainer" containerID="c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.113694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.113745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.113754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.113769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.113778 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.125794 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.138100 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.149329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.166267 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.176817 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.188717 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc 
kubenswrapper[4764]: I1001 16:03:30.200906 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.212463 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.216089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.216129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.216140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.216157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.216168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.224751 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.243631 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.255294 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.273823 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.286177 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.296710 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.305277 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.314964 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.318580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.318611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.318625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.318642 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.318653 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.325963 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.339517 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:30Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.420709 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.420768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.420786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.420805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.420815 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.523972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.524020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.524031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.524070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.524083 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.626257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.626284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.626292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.626308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.626318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.728734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.728769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.728780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.728794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.728804 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.831229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.831274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.831283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.831297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.831308 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.933366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.933402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.933410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.933425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:30 crc kubenswrapper[4764]: I1001 16:03:30.933434 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:30Z","lastTransitionTime":"2025-10-01T16:03:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.036151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.036197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.036208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.036222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.036233 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.114951 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/0.log" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.115020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerStarted","Data":"a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.133038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.143442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.143498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.143515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.143532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.143543 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration 
file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.148783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e78095
1f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.158735 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.184540 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.195908 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.209583 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.225166 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.239110 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.247228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.247263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.247271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.247286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.247296 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.255956 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.266828 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.279078 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.288825 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc 
kubenswrapper[4764]: I1001 16:03:31.302030 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.313654 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.326157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.342429 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.349558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.349894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.349993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.350114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.350204 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.360398 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.372203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:31Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.452454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.452518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.452537 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.452562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.452579 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.555189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.555470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.555555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.555656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.555737 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.657715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.657761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.657773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.657790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.657801 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.721347 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.721416 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.721448 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.721451 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:31 crc kubenswrapper[4764]: E1001 16:03:31.721565 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:31 crc kubenswrapper[4764]: E1001 16:03:31.721838 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:31 crc kubenswrapper[4764]: E1001 16:03:31.721872 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:31 crc kubenswrapper[4764]: E1001 16:03:31.721930 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.760124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.760179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.760193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.760209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.760221 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.862666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.862717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.862732 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.862752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.862765 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.964995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.965066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.965078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.965098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:31 crc kubenswrapper[4764]: I1001 16:03:31.965112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:31Z","lastTransitionTime":"2025-10-01T16:03:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.067979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.068028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.068065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.068083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.068097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.169995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.170035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.170067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.170082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.170094 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.272546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.272604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.272618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.272633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.272643 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.374685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.374722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.374734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.374752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.374763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.477530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.477570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.477584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.477600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.477611 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.579480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.579542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.579557 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.579573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.579586 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.682258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.682310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.682325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.682345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.682359 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.784403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.784432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.784440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.784454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.784462 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.886362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.886395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.886408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.886422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.886432 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.988756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.988787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.988795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.988807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:32 crc kubenswrapper[4764]: I1001 16:03:32.988816 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:32Z","lastTransitionTime":"2025-10-01T16:03:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.091851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.091915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.091932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.091950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.091961 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.195478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.195545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.195559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.195577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.195591 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.297991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.298067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.298076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.298091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.298100 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.400824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.400867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.400877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.400892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.400901 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.503184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.503234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.503244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.503263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.503279 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.606721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.607208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.607220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.607236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.607245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.710124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.710173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.710188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.710203 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.710214 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.721663 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.721765 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.721765 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:33 crc kubenswrapper[4764]: E1001 16:03:33.721847 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:33 crc kubenswrapper[4764]: E1001 16:03:33.721912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:33 crc kubenswrapper[4764]: E1001 16:03:33.722024 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.722072 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:33 crc kubenswrapper[4764]: E1001 16:03:33.722217 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.813952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.813995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.814007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.814024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.814035 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.916681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.916718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.916726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.916741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:33 crc kubenswrapper[4764]: I1001 16:03:33.916749 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:33Z","lastTransitionTime":"2025-10-01T16:03:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.019797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.019902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.019927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.019998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.020021 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.122003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.122175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.122191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.122907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.123008 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.226389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.226444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.226456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.226476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.226488 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.329953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.330015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.330029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.330068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.330083 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.432549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.432600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.432611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.432629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.432640 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.535327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.535375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.535387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.535402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.535413 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.638095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.638144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.638157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.638174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.638187 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.740700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.740760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.740772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.740791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.740804 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.822767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.822823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.822834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.822854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.822866 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: E1001 16:03:34.836838 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:34Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.840909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.840975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.840985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.841019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.841031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: E1001 16:03:34.873642 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:34Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.877704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.877748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.877758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.877770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.877781 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: E1001 16:03:34.898191 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:34Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.907280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.907349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.907365 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.907519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.907639 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:34 crc kubenswrapper[4764]: E1001 16:03:34.923160 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:34Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:34 crc kubenswrapper[4764]: E1001 16:03:34.923368 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.925093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.925146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.925156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.925173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:34 crc kubenswrapper[4764]: I1001 16:03:34.925185 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:34Z","lastTransitionTime":"2025-10-01T16:03:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.027848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.027890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.027900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.027914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.027923 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.130319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.130362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.130373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.130388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.130401 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.233767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.233811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.233822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.233837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.233848 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.336795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.336839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.336848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.336863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.336873 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.439755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.439842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.439860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.439884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.439903 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.542472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.542526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.542545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.542566 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.542583 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.644744 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.644784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.644796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.644811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.644821 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.721263 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.721320 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.721365 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.721434 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:35 crc kubenswrapper[4764]: E1001 16:03:35.721647 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:35 crc kubenswrapper[4764]: E1001 16:03:35.721759 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:35 crc kubenswrapper[4764]: E1001 16:03:35.721893 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:35 crc kubenswrapper[4764]: E1001 16:03:35.721968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.754662 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.754717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.754728 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.754746 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.754757 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.858531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.858592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.858605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.858625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.858637 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.961233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.961293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.961310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.961334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:35 crc kubenswrapper[4764]: I1001 16:03:35.961352 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:35Z","lastTransitionTime":"2025-10-01T16:03:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.064185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.064247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.064259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.064280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.064297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.167818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.167874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.167887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.167907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.167920 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.270431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.270489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.270508 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.270537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.270552 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.373716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.373784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.373810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.373834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.373852 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.476293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.476369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.476381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.476399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.476434 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.579504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.579544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.579554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.579583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.579592 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.681092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.681144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.681166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.681185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.681198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.782888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.782931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.782942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.782961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.782974 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.885499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.885534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.885543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.885556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.885565 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.988200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.988276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.988290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.988306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:36 crc kubenswrapper[4764]: I1001 16:03:36.988316 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:36Z","lastTransitionTime":"2025-10-01T16:03:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.091198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.091243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.091254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.091271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.091281 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.194286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.194317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.194325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.194339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.194349 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.296516 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.296550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.296558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.296572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.296580 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.398613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.398644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.398654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.398688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.398699 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.501888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.501957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.501975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.501998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.502015 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.605246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.605335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.605389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.606191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.606326 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.709955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.710112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.710131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.710155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.710173 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.721673 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.721737 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.721770 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.721800 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:37 crc kubenswrapper[4764]: E1001 16:03:37.721884 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:37 crc kubenswrapper[4764]: E1001 16:03:37.721986 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:37 crc kubenswrapper[4764]: E1001 16:03:37.722420 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:37 crc kubenswrapper[4764]: E1001 16:03:37.722586 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.735764 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.747575 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.760568 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.794926 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.811577 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.813406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.813456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.813469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.813484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.813511 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.826310 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.842826 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.854827 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.866018 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.875400 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.886849 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.899588 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc 
kubenswrapper[4764]: I1001 16:03:37.916525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.916560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.916571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.916587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.916599 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:37Z","lastTransitionTime":"2025-10-01T16:03:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.916632 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.929293 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.943912 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.956569 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:37 crc kubenswrapper[4764]: I1001 16:03:37.988266 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:37Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.002286 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.013988 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:38Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.018363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.018399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.018411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.018428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.018439 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.120888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.120927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.120938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.120953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.120966 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.223506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.223571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.223589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.223613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.223630 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.326465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.326511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.326523 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.326539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.326552 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.429355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.429406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.429415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.429428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.429437 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.531937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.532012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.532029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.532064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.532082 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.635521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.635563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.635578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.635600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.635614 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.722528 4764 scope.go:117] "RemoveContainer" containerID="fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.738491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.738550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.738570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.738594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.738618 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.842281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.842323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.842336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.842356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.842372 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.944933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.944995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.945008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.945028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:38 crc kubenswrapper[4764]: I1001 16:03:38.945064 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:38Z","lastTransitionTime":"2025-10-01T16:03:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.047968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.048072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.048091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.048138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.048156 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.140770 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/2.log" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.143320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.143889 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.150699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.150737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.150748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.150762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.150772 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.162325 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.175314 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.193265 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.210736 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.226938 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.252347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.252381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.252389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.252403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.252412 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.255788 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.284513 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.297784 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.309518 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc 
kubenswrapper[4764]: I1001 16:03:39.321580 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0ec5e3-7361-47d0-8933-4b35d10037fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.333038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.343818 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.354946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.354991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.355004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.355023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.355036 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.358902 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.382721 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.391388 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.413484 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.425347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.437488 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.446926 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.457205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.457244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.457260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.457275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.457284 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.559455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.559514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.559527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.559548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.559562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.662213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.662254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.662263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.662279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.662288 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.721154 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.721216 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.721180 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:39 crc kubenswrapper[4764]: E1001 16:03:39.721287 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.721161 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:39 crc kubenswrapper[4764]: E1001 16:03:39.721403 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:39 crc kubenswrapper[4764]: E1001 16:03:39.721492 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:39 crc kubenswrapper[4764]: E1001 16:03:39.721607 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.764371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.764414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.764425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.764440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.764451 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.866772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.866824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.866840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.866861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.866873 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.969498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.969580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.969604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.969635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:39 crc kubenswrapper[4764]: I1001 16:03:39.969657 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:39Z","lastTransitionTime":"2025-10-01T16:03:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.073178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.073257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.073280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.073309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.073330 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.150023 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/3.log" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.150901 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/2.log" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.154732 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" exitCode=1 Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.154774 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.154864 4764 scope.go:117] "RemoveContainer" containerID="fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.155536 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:03:40 crc kubenswrapper[4764]: E1001 16:03:40.155740 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.175806 4764 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.176190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.176205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.176226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.176240 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.187444 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3
de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.208507 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.223931 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.246469 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc9a67ceb77a9c2e2ce66ce20d57475a535069afb27a408aabd6ca1e78459dea\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:11Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
16:03:11.583374 6425 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1001 16:03:11.583405 6425 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1001 16:03:11.583424 6425 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1001 16:03:11.583482 6425 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 16:03:11.583515 6425 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 16:03:11.583779 6425 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 16:03:11.583859 6425 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 16:03:11.583901 6425 ovnkube.go:599] Stopped ovnkube\\\\nI1001 16:03:11.583931 6425 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 16:03:11.584009 6425 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:39Z\\\",\\\"message\\\":\\\"uring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1001 16:03:39.584179 6817 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1001 16:03:39.584183 6817 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1001 16:03:39.583527 6817 ovnkube.go:137] 
failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z]\\\\nI1001 16:03:39.583998 6817 services_controller.go:451] Built service openshift-service-ca-oper\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"m
ountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.258848 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.268831 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0ec5e3-7361-47d0-8933-4b35d10037fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.277923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.277952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.277962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.277976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.277988 4764 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.284383 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.302006 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.310761 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.332068 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0f
a3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource
-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\
\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.348633 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.363200 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.377610 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.380796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.380824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.380833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.380848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.380856 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.389741 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.403350 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.416615 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.428601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.440982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc 
kubenswrapper[4764]: I1001 16:03:40.456708 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:40Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.483489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.483554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.483572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.483600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.483615 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.586868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.586928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.586951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.586977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.586995 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.690402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.690469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.690489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.690514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.690529 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.794241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.794637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.794669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.799731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.799805 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.902508 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.902580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.902597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.902615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:40 crc kubenswrapper[4764]: I1001 16:03:40.902627 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:40Z","lastTransitionTime":"2025-10-01T16:03:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.004966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.005008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.005019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.005035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.005075 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.108430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.108501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.108528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.108560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.108583 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.161318 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/3.log" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.166872 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.167178 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.181119 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.202005 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.211374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.211424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.211437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.211454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.211469 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.225833 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.244063 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.256641 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.270338 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.279870 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.294480 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.310382 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc 
kubenswrapper[4764]: I1001 16:03:41.314812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.315085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.315150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.315264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.315345 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.325674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0ec5e3-7361-47d0-8933-4b35d10037fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.338847 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.351699 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.365310 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.387120 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:39Z\\\",\\\"message\\\":\\\"uring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1001 16:03:39.584179 6817 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1001 16:03:39.584183 6817 default_network_controller.go:776] Recording success 
event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1001 16:03:39.583527 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z]\\\\nI1001 16:03:39.583998 6817 services_controller.go:451] Built service openshift-service-ca-oper\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.399789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.418165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.418526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.418655 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.418825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.418962 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.424033 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1ee
d92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.437239 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.450623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.463941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:41Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.523402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.523455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.523468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.523485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.523499 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.525211 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.525358 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:45.525337493 +0000 UTC m=+148.524984328 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.525402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.525480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.525550 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.525596 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:04:45.52558729 +0000 UTC m=+148.525234125 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.525609 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.525671 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:04:45.525652091 +0000 UTC m=+148.525298926 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.625940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.626010 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626134 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626165 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626179 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626182 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626202 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626217 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626240 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 16:04:45.626224697 +0000 UTC m=+148.625871532 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.626267 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 16:04:45.626251648 +0000 UTC m=+148.625898493 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.626998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.627025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.627034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.627068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.627081 4764 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.721846 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.722274 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.721918 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.722546 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.721901 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.722767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.721920 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:41 crc kubenswrapper[4764]: E1001 16:03:41.722988 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.728972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.729018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.729031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.729067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.729079 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.831643 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.831700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.831708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.831720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.831729 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.934522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.934588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.934606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.934636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:41 crc kubenswrapper[4764]: I1001 16:03:41.934654 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:41Z","lastTransitionTime":"2025-10-01T16:03:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.037557 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.037623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.037644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.038140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.038185 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.141000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.141067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.141078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.141095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.141105 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.243650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.243716 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.243727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.243748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.243760 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.346161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.346200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.346213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.346229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.346242 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.453492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.453582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.453605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.453635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.453663 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.556765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.556815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.556827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.556846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.556859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.658779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.658820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.658831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.658847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.658859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.760887 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.760932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.760943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.760962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.760976 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.864083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.864123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.864134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.864152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.864165 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.967121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.967168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.967178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.967195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:42 crc kubenswrapper[4764]: I1001 16:03:42.967206 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:42Z","lastTransitionTime":"2025-10-01T16:03:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.071270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.071336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.071349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.071371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.071383 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.172874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.173132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.173361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.173531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.173712 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.276587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.276912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.277021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.277302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.277539 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.380486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.380515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.380523 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.380536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.380544 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.482589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.482645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.482727 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.482753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.482770 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.585898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.585958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.585976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.586000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.586019 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.688908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.688973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.688995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.689023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.689043 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.722319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:43 crc kubenswrapper[4764]: E1001 16:03:43.722477 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.722700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:43 crc kubenswrapper[4764]: E1001 16:03:43.722779 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.722917 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:43 crc kubenswrapper[4764]: E1001 16:03:43.722989 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.723184 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:43 crc kubenswrapper[4764]: E1001 16:03:43.723323 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.792336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.792382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.792392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.792406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.792416 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.897382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.897442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.897464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.897489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:43 crc kubenswrapper[4764]: I1001 16:03:43.897509 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:43Z","lastTransitionTime":"2025-10-01T16:03:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.001233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.001298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.001314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.001339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.001357 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.104161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.104228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.104309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.104335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.104352 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.206424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.206465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.206476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.206491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.206502 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.309368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.309423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.309443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.309462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.309474 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.412757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.412820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.412836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.412858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.412874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.515719 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.515774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.515785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.515802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.515814 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.618100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.618149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.618166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.618189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.618206 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.721494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.721536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.721547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.721559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.721569 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.823725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.823777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.823797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.823815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.823826 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.925613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.925697 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.925718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.925743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:44 crc kubenswrapper[4764]: I1001 16:03:44.925764 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:44Z","lastTransitionTime":"2025-10-01T16:03:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.003231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.003319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.003328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.003341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.003350 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.014665 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.019229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.019287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.019299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.019314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.019324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.033307 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.036982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.037121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.037197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.037271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.037342 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.050064 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.053474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.053498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.053516 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.053531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.053542 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.065791 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.069451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.069477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.069486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.069498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.069507 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.086237 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:45Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.086355 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.087840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.087870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.087880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.087896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.087906 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.190918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.190971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.190987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.191008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.191023 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.293612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.293674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.293687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.293707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.293720 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.397126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.397187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.397198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.397217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.397595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.499863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.499902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.499915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.499930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.499938 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.602189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.602233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.602245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.602263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.602274 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.705214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.705257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.705268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.705285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.705297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.721309 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.721351 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.721432 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.721636 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.721799 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.721896 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.722194 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:45 crc kubenswrapper[4764]: E1001 16:03:45.722327 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.808986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.809043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.809089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.809115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.809132 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.911143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.911181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.911191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.911204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:45 crc kubenswrapper[4764]: I1001 16:03:45.911213 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:45Z","lastTransitionTime":"2025-10-01T16:03:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.013838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.013936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.013959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.013987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.014010 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.117690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.117750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.117767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.117791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.117809 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.220826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.220882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.220899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.220921 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.220941 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.323770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.323813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.323824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.323839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.323850 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.426695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.426751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.426766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.426786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.426801 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.528662 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.528714 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.528731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.528753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.528769 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.633528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.633567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.633576 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.633591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.633602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.735859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.735920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.735935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.735958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.735976 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.839596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.839666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.839685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.839735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.839754 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.942707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.942752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.942763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.942782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:46 crc kubenswrapper[4764]: I1001 16:03:46.942793 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:46Z","lastTransitionTime":"2025-10-01T16:03:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.045330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.045391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.045410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.045434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.045453 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.148214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.148278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.148289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.148306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.148316 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.251431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.251468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.251476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.251491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.251502 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.354947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.355018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.355031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.355090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.355105 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.458422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.458465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.458476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.458492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.458503 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.561700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.561751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.561762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.561777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.561786 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.664571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.664624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.664638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.664657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.664666 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.721311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.721377 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.721319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.721319 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:47 crc kubenswrapper[4764]: E1001 16:03:47.721601 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:47 crc kubenswrapper[4764]: E1001 16:03:47.721728 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:47 crc kubenswrapper[4764]: E1001 16:03:47.721876 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:47 crc kubenswrapper[4764]: E1001 16:03:47.721963 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.738276 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\
\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wherea
bouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.751452 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.764235 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.766919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.766970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.766988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 
16:03:47.767004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.767014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.778644 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.790401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.803205 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.816065 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc 
kubenswrapper[4764]: I1001 16:03:47.834486 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.848733 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.861867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.869634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.869675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.869684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.869700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.869711 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.876592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.895163 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:39Z\\\",\\\"message\\\":\\\"uring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in 
node crc\\\\nI1001 16:03:39.584179 6817 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1001 16:03:39.584183 6817 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1001 16:03:39.583527 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z]\\\\nI1001 16:03:39.583998 6817 services_controller.go:451] Built service openshift-service-ca-oper\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.909323 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.921780 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0ec5e3-7361-47d0-8933-4b35d10037fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.936124 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.950477 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.961130 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.972792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.972841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.972852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.972870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.972882 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:47Z","lastTransitionTime":"2025-10-01T16:03:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.980004 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:47 crc kubenswrapper[4764]: I1001 16:03:47.999436 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:47Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.075664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.075744 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.075761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.075791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.075810 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.182581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.182646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.182660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.182683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.182743 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.286237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.286312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.286328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.286352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.286370 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.389501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.389614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.389632 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.389685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.389699 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.492134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.492179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.492190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.492205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.492215 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.594802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.594871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.594893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.594920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.594939 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.697424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.697464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.697472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.697487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.697495 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.799668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.799722 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.799739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.799760 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.799776 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.902675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.902745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.902768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.902798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:48 crc kubenswrapper[4764]: I1001 16:03:48.902832 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:48Z","lastTransitionTime":"2025-10-01T16:03:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.005211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.005238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.005245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.005258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.005266 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.106976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.107005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.107013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.107026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.107034 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.210085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.210138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.210152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.210171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.210185 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.313179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.313295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.313321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.313353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.313379 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.415691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.415785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.415800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.415824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.415839 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.518427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.518507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.518526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.518556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.518605 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.622415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.622486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.622510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.622539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.622562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.721603 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.721704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:49 crc kubenswrapper[4764]: E1001 16:03:49.721775 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.721613 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:49 crc kubenswrapper[4764]: E1001 16:03:49.721886 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:49 crc kubenswrapper[4764]: E1001 16:03:49.722181 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.722673 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:49 crc kubenswrapper[4764]: E1001 16:03:49.723030 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.724423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.724706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.724920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.725254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.725484 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.828941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.829082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.829109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.829137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.829162 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.931809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.932161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.932334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.932501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:49 crc kubenswrapper[4764]: I1001 16:03:49.932652 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:49Z","lastTransitionTime":"2025-10-01T16:03:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.035910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.035956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.035964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.035981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.035996 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.138480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.138572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.138591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.138613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.138629 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.241279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.241575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.241682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.241792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.241879 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.344183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.344273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.344290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.344309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.344321 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.446627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.446669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.446680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.446692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.446702 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.549503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.549802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.549898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.549994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.550149 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.653434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.653483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.653494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.653511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.653524 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.756885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.756939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.756950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.756966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.756975 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.860540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.860590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.860606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.860621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.860630 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.965328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.965368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.965379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.965396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:50 crc kubenswrapper[4764]: I1001 16:03:50.965407 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:50Z","lastTransitionTime":"2025-10-01T16:03:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.068518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.068956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.069183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.069403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.069599 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.172040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.172103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.172115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.172131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.172144 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.274875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.274923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.274937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.274955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.274968 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.378111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.378155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.378165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.378180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.378193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.480879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.480951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.480969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.480993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.481014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.584116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.584214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.584227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.584246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.584259 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.687857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.687918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.687929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.687960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.687971 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.721544 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.721590 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.721705 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:51 crc kubenswrapper[4764]: E1001 16:03:51.721815 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.722039 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:51 crc kubenswrapper[4764]: E1001 16:03:51.722103 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:51 crc kubenswrapper[4764]: E1001 16:03:51.722217 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:51 crc kubenswrapper[4764]: E1001 16:03:51.722281 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.790014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.790074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.790088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.790105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.790117 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.893104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.893142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.893155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.893174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.893186 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.996691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.997235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.997443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.997626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:51 crc kubenswrapper[4764]: I1001 16:03:51.997790 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:51Z","lastTransitionTime":"2025-10-01T16:03:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.100205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.100249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.100263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.100278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.100291 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.202545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.202592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.202605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.202622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.202637 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.305824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.305856 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.305866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.305883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.305896 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.408756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.408803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.408814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.408831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.408842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.510925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.510995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.511022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.511087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.511112 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.613651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.613699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.613710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.613725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.613735 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.716225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.716264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.716273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.716287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.716296 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.818539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.818580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.818591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.818608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.818619 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.920951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.921193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.921305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.921402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:52 crc kubenswrapper[4764]: I1001 16:03:52.921486 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:52Z","lastTransitionTime":"2025-10-01T16:03:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.026987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.027035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.027070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.027094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.027110 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.129908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.130265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.130378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.130475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.130588 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.232637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.232693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.232702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.232715 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.232744 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.334744 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.334793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.334814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.334842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.334862 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.438371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.438416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.438427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.438443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.438453 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.540982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.541062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.541077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.541097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.541110 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.644384 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.644753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.644790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.644823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.644838 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.723752 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.724406 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.724375 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:53 crc kubenswrapper[4764]: E1001 16:03:53.724564 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:53 crc kubenswrapper[4764]: E1001 16:03:53.724606 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.724331 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.724432 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:53 crc kubenswrapper[4764]: E1001 16:03:53.724734 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:53 crc kubenswrapper[4764]: E1001 16:03:53.724937 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:53 crc kubenswrapper[4764]: E1001 16:03:53.725348 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.747371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.747677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.747847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.748027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.748220 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.851186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.851439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.851525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.851625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.851719 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.954556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.954620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.954638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.954661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:53 crc kubenswrapper[4764]: I1001 16:03:53.954679 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:53Z","lastTransitionTime":"2025-10-01T16:03:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.057742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.057802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.057819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.057840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.057854 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.159797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.159830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.159838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.159852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.159860 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.263233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.263276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.263285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.263299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.263309 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.366661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.367180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.367400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.367615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.367812 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.471881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.471947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.471961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.471982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.471995 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.575615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.575687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.575705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.575734 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.575759 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.678806 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.678885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.678910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.678942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.678964 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.782467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.782520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.782539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.782558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.782569 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.886221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.886398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.886424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.886457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.886478 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.989501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.989751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.989838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.989967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:54 crc kubenswrapper[4764]: I1001 16:03:54.990066 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:54Z","lastTransitionTime":"2025-10-01T16:03:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.092389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.092477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.092491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.092512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.092523 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.195480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.195529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.195541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.195556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.195567 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.275411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.275459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.275470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.275489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.275501 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.289442 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.293499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.293540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.293549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.293563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.293573 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.307588 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.312140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.312364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.312456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.312551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.312638 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.325577 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.328906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.328954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.328965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.328981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.328993 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.339671 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.342517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.342695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.342839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.342969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.343078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.353770 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:55Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.354146 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.355813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.355840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.355869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.355883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.355893 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.459233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.459383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.459408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.459474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.459498 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.562463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.562502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.562512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.562528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.562538 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.664954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.664986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.664995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.665008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.665017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.721654 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.721756 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.721819 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.722595 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.722088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.722199 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.721820 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:55 crc kubenswrapper[4764]: E1001 16:03:55.722882 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.767605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.767685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.767699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.767720 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.767758 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.869929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.869972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.869982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.869997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.870006 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.971902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.971940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.971950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.971965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:55 crc kubenswrapper[4764]: I1001 16:03:55.971976 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:55Z","lastTransitionTime":"2025-10-01T16:03:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.074575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.074794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.074855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.074914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.074978 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.177433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.177497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.177517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.177541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.177558 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.286149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.286274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.286298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.286323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.286405 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.390816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.391094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.391169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.391282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.391362 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.494090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.494144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.494156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.494175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.494187 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.596731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.597078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.597176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.597251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.597314 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.699358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.699402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.699414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.699435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.699447 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.801861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.801918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.801930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.801946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.801957 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.905469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.906173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.906212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.906242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:56 crc kubenswrapper[4764]: I1001 16:03:56.906266 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:56Z","lastTransitionTime":"2025-10-01T16:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.009120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.009175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.009185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.009202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.009213 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.111829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.111880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.111895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.111915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.111930 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.214909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.215183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.215303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.215391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.215491 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.318335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.318378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.318387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.318404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.318414 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.420377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.420437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.420459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.420483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.420511 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.523266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.523326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.523342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.523360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.523385 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.625726 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.625763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.625774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.625788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.625800 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.721246 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.721299 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.721429 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.721437 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:57 crc kubenswrapper[4764]: E1001 16:03:57.721627 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:57 crc kubenswrapper[4764]: E1001 16:03:57.721731 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:57 crc kubenswrapper[4764]: E1001 16:03:57.721786 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:57 crc kubenswrapper[4764]: E1001 16:03:57.721874 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.728900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.728933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.728942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.728954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.728964 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.738462 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.752909 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.763954 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.786766 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16
:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.800399 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.816694 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.828538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.831269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.831325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.831338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.831357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.831369 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.842941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.860377 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.872678 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.886461 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.904161 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc 
kubenswrapper[4764]: I1001 16:03:57.920866 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04
010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 
16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.933693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.933739 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.933753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.933774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.933790 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:57Z","lastTransitionTime":"2025-10-01T16:03:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.934101 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.947039 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.958503 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.980211 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:39Z\\\",\\\"message\\\":\\\"uring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1001 16:03:39.584179 6817 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1001 16:03:39.584183 6817 default_network_controller.go:776] Recording success 
event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1001 16:03:39.583527 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z]\\\\nI1001 16:03:39.583998 6817 services_controller.go:451] Built service openshift-service-ca-oper\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:57 crc kubenswrapper[4764]: I1001 16:03:57.992229 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.001781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0ec5e3-7361-47d0-8933-4b35d10037fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:57Z is after 2025-08-24T17:21:41Z" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.036017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.036082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.036099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.036114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.036124 4764 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.138400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.138442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.138455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.138471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.138484 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.240141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.240174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.240183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.240214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.240226 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.342825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.342862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.342871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.342884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.342895 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.445921 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.445969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.445978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.445992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.446003 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.548751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.548801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.548854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.548882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.548922 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.652475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.652529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.652541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.652558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.652572 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.754748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.754779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.754787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.754801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.754810 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.857901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.857959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.857977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.857997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.858013 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.960538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.961363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.961487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.961522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:58 crc kubenswrapper[4764]: I1001 16:03:58.961540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:58Z","lastTransitionTime":"2025-10-01T16:03:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.064207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.064280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.064293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.064308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.064318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.167005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.167071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.167086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.167102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.167113 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.269828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.269871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.269884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.269899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.269909 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.372272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.372328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.372339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.372351 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.372360 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.474753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.474816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.474829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.474845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.474854 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.577502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.577529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.577536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.577548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.577557 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.680315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.680610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.680736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.680821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.680896 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.721790 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.721855 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.721883 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.721822 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:03:59 crc kubenswrapper[4764]: E1001 16:03:59.722063 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:03:59 crc kubenswrapper[4764]: E1001 16:03:59.721956 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:03:59 crc kubenswrapper[4764]: E1001 16:03:59.722189 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:03:59 crc kubenswrapper[4764]: E1001 16:03:59.722301 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.784292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.784365 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.784379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.784401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.784415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.887114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.887183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.887200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.887224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.887239 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.990464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.990495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.990503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.990521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:03:59 crc kubenswrapper[4764]: I1001 16:03:59.990530 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:03:59Z","lastTransitionTime":"2025-10-01T16:03:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.092659 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.092750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.092768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.092789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.092803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.195137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.195203 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.195218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.195240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.195256 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.297910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.297969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.297987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.298010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.298026 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.400127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.400166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.400177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.400193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.400203 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.502719 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.502764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.502776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.502797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.502809 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.542365 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:00 crc kubenswrapper[4764]: E1001 16:04:00.542556 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:04:00 crc kubenswrapper[4764]: E1001 16:04:00.542631 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs podName:41a0358d-ae10-4282-9423-8f3599adbc2a nodeName:}" failed. No retries permitted until 2025-10-01 16:05:04.542610291 +0000 UTC m=+167.542257136 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs") pod "network-metrics-daemon-btbfp" (UID: "41a0358d-ae10-4282-9423-8f3599adbc2a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.604775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.604843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.604861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.604880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.604890 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.707265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.707333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.707349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.707366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.707377 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.810635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.810712 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.810735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.810766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.810789 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.913205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.913243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.913253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.913268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:00 crc kubenswrapper[4764]: I1001 16:04:00.913278 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:00Z","lastTransitionTime":"2025-10-01T16:04:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.015376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.015412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.015430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.015447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.015458 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.118255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.118300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.118311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.118327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.118338 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.220555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.220588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.220596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.220609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.220619 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.322640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.322698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.322710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.322724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.322734 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.425039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.425108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.425124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.425138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.425148 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.528390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.528469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.528481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.528499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.528514 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.632491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.632568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.632592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.632619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.632637 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.721159 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.721209 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.721357 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:01 crc kubenswrapper[4764]: E1001 16:04:01.721499 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.721547 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:01 crc kubenswrapper[4764]: E1001 16:04:01.721659 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:01 crc kubenswrapper[4764]: E1001 16:04:01.721747 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:01 crc kubenswrapper[4764]: E1001 16:04:01.721802 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.734983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.735027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.735039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.735076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.735089 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.837913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.837959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.837969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.837987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.837998 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.940867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.940913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.940924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.940939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:01 crc kubenswrapper[4764]: I1001 16:04:01.940947 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:01Z","lastTransitionTime":"2025-10-01T16:04:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.044003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.044092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.044129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.044163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.044185 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.147072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.147130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.147139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.147155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.147165 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.250216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.250264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.250273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.250288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.250297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.353743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.353819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.353840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.353860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.353879 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.456409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.456454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.456466 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.456482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.456494 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.559503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.559573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.559626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.559641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.559649 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.662074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.662127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.662139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.662155 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.662168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.764899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.764940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.764948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.764962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.764980 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.867675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.867723 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.867735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.867751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.867761 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.970000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.970040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.970081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.970098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:02 crc kubenswrapper[4764]: I1001 16:04:02.970110 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:02Z","lastTransitionTime":"2025-10-01T16:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.072812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.072868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.072877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.072894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.072903 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.174977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.175019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.175030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.175062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.175075 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.277676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.277721 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.277731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.277748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.277757 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.380240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.380299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.380314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.380334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.380346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.482570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.482623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.482637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.482654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.482668 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.585075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.585121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.585134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.585152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.585164 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.687959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.688001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.688012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.688028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.688038 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.721397 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.721487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.721421 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.721574 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:03 crc kubenswrapper[4764]: E1001 16:04:03.721572 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:03 crc kubenswrapper[4764]: E1001 16:04:03.721650 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:03 crc kubenswrapper[4764]: E1001 16:04:03.721819 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:03 crc kubenswrapper[4764]: E1001 16:04:03.721894 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.791032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.791168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.791190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.791226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.791263 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.894231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.894378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.894439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.894469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.894492 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.998029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.998142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.998156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.998174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:03 crc kubenswrapper[4764]: I1001 16:04:03.998186 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:03Z","lastTransitionTime":"2025-10-01T16:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.102146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.102248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.102335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.102363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.102381 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.205272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.205328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.205340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.205360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.205370 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.309235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.309291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.309299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.309318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.309336 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.412415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.412476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.412488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.412509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.412523 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.515581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.515648 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.515665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.515690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.515707 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.617845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.617888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.617900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.617917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.617964 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.721239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.721383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.721396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.721412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.721422 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.824201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.824261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.824271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.824288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.824298 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.926906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.927008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.927018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.927033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:04 crc kubenswrapper[4764]: I1001 16:04:04.927061 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:04Z","lastTransitionTime":"2025-10-01T16:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.029822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.029869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.029880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.029894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.029904 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.132504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.132542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.132553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.132601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.132612 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.234460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.234499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.234510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.234637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.234654 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.337428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.337465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.337476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.337492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.337504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.440298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.440339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.440350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.440367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.440380 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.543492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.543561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.543584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.543610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.543625 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.646119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.646164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.646173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.646196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.646214 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.678483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.678524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.678538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.678555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.678568 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.690113 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:05Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.693740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.693776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.693787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.693802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.693813 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.705482 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:05Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.709254 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.709298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.709321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.709338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.709349 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.720999 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.721129 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.721676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.721756 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.721904 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.722004 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.722184 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.722186 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.722293 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.722368 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.722573 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:05Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.726850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.726884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.726892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.726905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.726916 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.738697 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:05Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.742967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.743003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.743014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.743097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.743125 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.757373 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T16:04:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2a812319-9b55-40ee-9d8a-92eb5dff7a6a\\\",\\\"systemUUID\\\":\\\"5f30d9a2-b6a5-482f-9083-66d464270d1d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:05Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:05 crc kubenswrapper[4764]: E1001 16:04:05.757547 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.759169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.759229 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.759241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.759258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.759271 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.862188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.862248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.862261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.862279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.862290 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.964956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.964995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.965003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.965015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:05 crc kubenswrapper[4764]: I1001 16:04:05.965024 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:05Z","lastTransitionTime":"2025-10-01T16:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.068013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.068072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.068088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.068103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.068111 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.171136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.171208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.171234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.171260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.171276 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.273975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.274035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.274072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.274085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.274094 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.376791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.377131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.377153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.377178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.377194 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.479580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.479659 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.479672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.479696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.479711 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.582525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.582584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.582597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.582618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.582629 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.685681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.685921 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.686021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.686121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.686187 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.788748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.788788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.788800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.788817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.788828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.892309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.892341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.892350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.892363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.892372 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.994844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.994919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.994940 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.994968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:06 crc kubenswrapper[4764]: I1001 16:04:06.994990 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:06Z","lastTransitionTime":"2025-10-01T16:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.096996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.097036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.097072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.097091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.097101 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.200179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.200256 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.200279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.200308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.200330 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.303549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.303586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.303596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.303612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.303623 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.406897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.406943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.406954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.406972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.406983 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.508811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.508883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.508906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.508938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.508959 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.612134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.612196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.612212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.612236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.612252 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.714100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.714145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.714156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.714175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.714188 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.721543 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.721570 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.721591 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:07 crc kubenswrapper[4764]: E1001 16:04:07.721649 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.721688 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:07 crc kubenswrapper[4764]: E1001 16:04:07.721725 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:07 crc kubenswrapper[4764]: E1001 16:04:07.721812 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:07 crc kubenswrapper[4764]: E1001 16:04:07.721880 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.737934 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90991a-c985-448f-b543-6f25e2fe2fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bcdcda34c76215558bd1f4063a20c09718febee98cd0be82e75b37b46dec0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://50b8c40af1bd04010c1ff3e7ed132dc2c9a61afe48d850670f033f63f45b97b3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://191113c41d979f123629a942e96cd74036c7fd3ee1f00bb6bfd35619e6e4aec1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2bc093f1c0c55952e2db53743e5a91a86316d57d5a4b037d6bcdca346a12c87\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://078b76cbf0e747a33f2ca5bd898288aa6a7fdd7ba8848293dcf03ec91ab1af32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1001 16:02:31.515850 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 16:02:31.516663 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1101976805/tls.crt::/tmp/serving-cert-1101976805/tls.key\\\\\\\"\\\\nI1001 16:02:37.022616 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 16:02:37.027110 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 16:02:37.027161 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 16:02:37.027208 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 16:02:37.027222 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 16:02:37.040599 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW1001 16:02:37.040635 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040644 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 16:02:37.040652 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 16:02:37.040661 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 16:02:37.040669 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 16:02:37.040686 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 16:02:37.040957 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 16:02:37.045644 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e759c339cdd8004b0872ca918e60677a12b14601d1d0b5be87f6bc2358d0b301\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382bb1fb009363a056a6a9f03e9f82bdc9e4f2a6b70523306b3493d53daa9b81\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.750856 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bb66cf5a-e8d1-4800-9f22-0c84f0d50725\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fd22046c3b24df94d6897e21fbd00b464c36be9fc031f13e3c10ba7c0c1296d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21be5d4e8a34aedbde76c8f6b346ed7d54e0c31e4879ef6ed489b3aea5a6e074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48ec7802fd895e9b17f2d4820da106b7e7ed43551369c17a2dbd4e1826b3c02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b253ec2e3460ff16243693566fd9e1387f8b898f1e3ff5b15d2f89697bc03e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.763952 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.773333 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-2bzj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77d0b256-53a7-44ab-aee2-904dd15bfa80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8958d971be86861b9f62c454e98e57598982cf4db98032dc205c78bffc1eb12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-slx78\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-2bzj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.786432 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2068a381-c49b-41a4-bd0d-8c525f9b30d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01899b33707c0e312d46060156ff94e0a3707bd4fbf1b0aeb8509d23656e9458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x65tx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zf6qx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.798892 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-btbfp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41a0358d-ae10-4282-9423-8f3599adbc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r7qxb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:56Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-btbfp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc 
kubenswrapper[4764]: I1001 16:04:07.810875 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e0ec5e3-7361-47d0-8933-4b35d10037fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d67eb7f641077dbdc2785600e9a2efc1c3e75dcafa93923f6a0ccc9b577cad07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54fa367474dbcf9f5a63ea77ada52e396c912f01ba63ab4ab591d9ad8aa0197d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.817236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.817276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.817295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 
16:04:07.817313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.817326 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.825721 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28d82567-2957-4e98-8fd9-604e231e87da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://439b2df900cd3d0437aeb5b9aeb76582d4a46e93f8ed315ca70d082d0ff67886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be9038a2c2ce5dc64794decd612eb3de7334b898095dc37eddb6913084df93d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e49de8326237539da6f78c6f0fa8e2f045dc7db1c8dca90a28b0ce79813b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95d1f2d4237027a12400f86120fc27b35c230c860cdafc39f7c4191c59f5700d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.838385 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91393aaef3d9858546fc39deae2f31b19dcfe1ef061e89c79f9bdbb2d84fe12b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.852854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ks425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5499b593-79e4-408e-a32b-9e132d3a0de7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:30Z\\\",\\\"message\\\":\\\"2025-10-01T16:02:44+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533\\\\n2025-10-01T16:02:44+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_3364fdd0-8915-4cfa-b8c5-24d31979c533 to /host/opt/cni/bin/\\\\n2025-10-01T16:02:44Z [verbose] multus-daemon started\\\\n2025-10-01T16:02:44Z [verbose] Readiness Indicator file check\\\\n2025-10-01T16:03:29Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:03:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n8ktn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ks425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.871072 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T16:03:39Z\\\",\\\"message\\\":\\\"uring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI1001 16:03:39.584179 6817 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI1001 16:03:39.584183 6817 default_network_controller.go:776] Recording success 
event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1001 16:03:39.583527 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:03:39Z is after 2025-08-24T17:21:41Z]\\\\nI1001 16:03:39.583998 6817 services_controller.go:451] Built service openshift-service-ca-oper\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T16:03:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://021613f8b927734a4e
00126175b44abbb6ba8662dd730af0951453de9e3abc24\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5zbvx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fngxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.882817 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"321c50a8-5c97-4d27-9e2e-5ec64a57905a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://efc177ca23eec8a50eece265edb9536e65e853cc7084de0aa162e4b99ae4e78f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5faaeffea1c1cb397fc4ed61ad3a0d1142e0
cdbc3d0a57ae71308beabdfd310\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rvzwr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-dt47g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.901302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6affb73-a730-4956-8dcb-0e2d3a2f5d9b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e629b30dfc110d809317d996f9c3997e88472c20c95cf9e9da7db3a26850e83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da7dc03de1309f0750a605e5d2918ed668bbad9ff959ced29faddfe951ed8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2674be89b758b3623f887180edfe5fc96c75318ae59a086e388ac371a1e64c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff1662d2c60dd22284940dbacc02ba9167ec6dae94e1c395d6ee3841f01b1623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00fd0218cc5b041329935250a91765d8f268779e4bb0061c23f9b6dae10c3eb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1f6e4a43294119d996b8ec18ef6f123aa2f7b264fe972e7e737d402debcb04e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T16:02:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5b2be33e6b6daeced26ad08d4bd31fb7cfbf789623bc45b9f1e2245b997b8be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7107993d3e5d16503b84a7cc9d72efdc4ea02cccf1eed92889a894239b3b50b5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.913771 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9807ade5224cfc6dee59f49ffa19e92eff9abe0e896b71e5217160643ed1eed4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.919788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.919829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.919852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.919880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.919898 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:07Z","lastTransitionTime":"2025-10-01T16:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.925957 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a068b01291b8e20152cdf59994f58592dd567546d38e538b41774d29da48e3ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://849ca14447d8f6e0a80d4da6a7b934e780951f3e3a51f7be5fde158b4f179925\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.937216 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vq8z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eff65edf-094d-4261-838e-7ae318e5c6fc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://422c89145ff73879346894451f4eb12272371821bebc68a59cda20dff3f86506\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pfz9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vq8z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.951757 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.966413 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:07 crc kubenswrapper[4764]: I1001 16:04:07.984511 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jssc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ff6bf25-73d6-4e89-b803-12502064e5f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T16:02:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1cbfab5c5e27a67ccc76cad98b750d029d07d44398aac2d530cfad75b96f425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T16:02:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c27dca1a0035225b71a71162c2b97a68db66256f03c7fa0631e453543749e6ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30d6cd47f9f11ffdb167b6e0e0365f8e2f78bb9363bccf1d1067bab220b017a3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:44Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc33c964fa637e0ec645ba045352498f38770eda88990d79a6eca246699011ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f225d
09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f225d09a7196e160c6f7b5216c9bdf2b192f6fb1be2e2fb1cd2ef07b5af1fb2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32496ca894b68a9805c0f254f3c7e6408a934153b61689b147b8db50222e4325\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:47Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d0607b8550cb981dbdcc113e43ce76d6e67118ad59139f9f35ee25dcc5f31e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T16:02:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T16:02:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9bdwp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T16:02:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jssc8\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T16:04:07Z is after 2025-08-24T17:21:41Z" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.022412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.022462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.022474 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.022490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.022503 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.124784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.124850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.124862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.124882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.124898 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.228554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.228606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.228617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.228634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.228651 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.332403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.332724 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.332817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.332906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.332980 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.436543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.436626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.436663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.436694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.436715 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.539824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.539889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.539902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.539925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.539943 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.643735 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.643794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.643807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.643830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.643850 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.746948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.747019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.747084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.747120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.747141 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.850312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.850371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.850381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.850399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.850413 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.952915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.952968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.952982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.953002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:08 crc kubenswrapper[4764]: I1001 16:04:08.953017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:08Z","lastTransitionTime":"2025-10-01T16:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.056504 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.056757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.056820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.056899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.056955 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.159171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.159207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.159216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.159231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.159240 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.262695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.262754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.262767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.262786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.262801 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.365515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.365599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.365613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.365634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.365646 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.467298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.467331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.467339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.467353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.467363 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.570378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.570412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.570423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.570439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.570448 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.672610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.672961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.673113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.673236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.673327 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.723634 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:09 crc kubenswrapper[4764]: E1001 16:04:09.724164 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.724029 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:09 crc kubenswrapper[4764]: E1001 16:04:09.724784 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.724007 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:09 crc kubenswrapper[4764]: E1001 16:04:09.725213 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.724062 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:09 crc kubenswrapper[4764]: E1001 16:04:09.725505 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.776472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.776521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.776532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.776547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.776556 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.880159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.880571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.880731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.880910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.881072 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.984821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.985320 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.985488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.985690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:09 crc kubenswrapper[4764]: I1001 16:04:09.985825 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:09Z","lastTransitionTime":"2025-10-01T16:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.088717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.088769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.088781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.088798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.088811 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.191942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.191986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.191997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.192014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.192025 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.295299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.295370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.295393 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.295423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.295445 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.398133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.398189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.398204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.398227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.398243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.501126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.501164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.501177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.501192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.501204 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.603932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.604007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.604025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.604082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.604100 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.706928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.706968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.706979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.706998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.707010 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.809762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.809798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.809808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.809823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.809833 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.912460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.912544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.912555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.912579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:10 crc kubenswrapper[4764]: I1001 16:04:10.912593 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:10Z","lastTransitionTime":"2025-10-01T16:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.015111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.015169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.015186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.015211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.015227 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.117789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.117843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.117857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.117878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.117894 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.220751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.220807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.220824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.220848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.220867 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.323308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.323348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.323359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.323374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.323384 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.426106 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.426161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.426173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.426191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.426202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.528696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.528751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.528764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.528784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.528798 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.630779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.630825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.630840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.630856 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.630868 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.721478 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.721571 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.721640 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:11 crc kubenswrapper[4764]: E1001 16:04:11.721852 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.721900 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:11 crc kubenswrapper[4764]: E1001 16:04:11.722106 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:11 crc kubenswrapper[4764]: E1001 16:04:11.722037 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:11 crc kubenswrapper[4764]: E1001 16:04:11.722286 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.732802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.732846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.732859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.732875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.732887 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.835553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.835589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.835601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.835641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.835671 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.938550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.938580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.938590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.938602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:11 crc kubenswrapper[4764]: I1001 16:04:11.938612 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:11Z","lastTransitionTime":"2025-10-01T16:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.042167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.042210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.042219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.042234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.042243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.144251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.144287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.144295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.144308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.144317 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.247356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.247411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.247429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.247453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.247469 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.350506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.350695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.350706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.350718 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.350727 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.452717 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.453169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.453386 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.453537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.453684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.557108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.557414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.557535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.557639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.557719 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.660291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.660517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.660615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.660708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.660792 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.763243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.763670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.764003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.764397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.764751 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.870354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.870398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.870408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.870431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.870442 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.973009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.973361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.973499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.973650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:12 crc kubenswrapper[4764]: I1001 16:04:12.973793 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:12Z","lastTransitionTime":"2025-10-01T16:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.076547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.076869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.076944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.077094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.077196 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.179362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.179871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.179994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.180128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.180219 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.282813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.282851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.282862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.282875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.282884 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.385888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.385951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.385974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.386003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.386025 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.488515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.488807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.488895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.488975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.489082 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.592187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.592267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.592289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.592313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.592331 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.695632 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.695725 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.695742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.695958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.695974 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.721759 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.721763 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.721818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.721848 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:13 crc kubenswrapper[4764]: E1001 16:04:13.722462 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:13 crc kubenswrapper[4764]: E1001 16:04:13.722586 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:13 crc kubenswrapper[4764]: E1001 16:04:13.722726 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:13 crc kubenswrapper[4764]: E1001 16:04:13.722803 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.799728 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.799818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.799871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.799898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.799914 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.902223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.902291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.902313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.902341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:13 crc kubenswrapper[4764]: I1001 16:04:13.902362 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:13Z","lastTransitionTime":"2025-10-01T16:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.005424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.005475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.005487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.005503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.005514 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.107862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.108237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.108461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.108555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.108642 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.211409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.211481 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.211502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.211530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.211554 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.315338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.315383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.315399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.315421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.315438 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.418312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.418931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.419081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.419173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.419303 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.522500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.522561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.522574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.522596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.522609 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.625690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.625755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.625774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.625800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.625823 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.729107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.729176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.729201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.729234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.729258 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.832300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.832337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.832348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.832385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.832397 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.935801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.935903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.935931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.935958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:14 crc kubenswrapper[4764]: I1001 16:04:14.935979 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:14Z","lastTransitionTime":"2025-10-01T16:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.038930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.039405 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.039664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.039872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.040104 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.143428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.143468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.143479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.143495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.143506 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.246219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.246259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.246278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.246295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.246306 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.348704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.348763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.348772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.348785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.348794 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.451132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.451182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.451194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.451211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.451223 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.554314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.554357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.554371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.554403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.554417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.657768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.657859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.657876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.657931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.657951 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.721818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.721884 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.721882 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.721825 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:15 crc kubenswrapper[4764]: E1001 16:04:15.722036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:15 crc kubenswrapper[4764]: E1001 16:04:15.722283 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:15 crc kubenswrapper[4764]: E1001 16:04:15.722350 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:15 crc kubenswrapper[4764]: E1001 16:04:15.722472 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.761227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.761311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.761334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.761360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.761377 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.865272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.865345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.865368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.865398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.865419 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.967922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.968324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.968516 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.968670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:15 crc kubenswrapper[4764]: I1001 16:04:15.968795 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:15Z","lastTransitionTime":"2025-10-01T16:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.072688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.072774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.072803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.072835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.072857 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:16Z","lastTransitionTime":"2025-10-01T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.147571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.147960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.148197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.148398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.148557 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T16:04:16Z","lastTransitionTime":"2025-10-01T16:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.212567 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95"] Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.213205 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.215716 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.215932 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.216112 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.221352 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.273704 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/1.log" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.274612 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/0.log" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.274742 4764 generic.go:334] "Generic (PLEG): container finished" podID="5499b593-79e4-408e-a32b-9e132d3a0de7" containerID="a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d" exitCode=1 Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.274842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerDied","Data":"a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d"} Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.274935 4764 scope.go:117] "RemoveContainer" 
containerID="c4cc8f8b2607d59b615f206a9c492a272b1153a6a53f3853172ea7d4702cf1f2" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.275459 4764 scope.go:117] "RemoveContainer" containerID="a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d" Oct 01 16:04:16 crc kubenswrapper[4764]: E1001 16:04:16.275739 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ks425_openshift-multus(5499b593-79e4-408e-a32b-9e132d3a0de7)\"" pod="openshift-multus/multus-ks425" podUID="5499b593-79e4-408e-a32b-9e132d3a0de7" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.289520 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=99.289495493 podStartE2EDuration="1m39.289495493s" podCreationTimestamp="2025-10-01 16:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.289339769 +0000 UTC m=+119.288986644" watchObservedRunningTime="2025-10-01 16:04:16.289495493 +0000 UTC m=+119.289142338" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.289943 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jssc8" podStartSLOduration=94.289936014 podStartE2EDuration="1m34.289936014s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.260790489 +0000 UTC m=+119.260437334" watchObservedRunningTime="2025-10-01 16:04:16.289936014 +0000 UTC m=+119.289582859" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.307479 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=99.307456604 podStartE2EDuration="1m39.307456604s" podCreationTimestamp="2025-10-01 16:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.306951252 +0000 UTC m=+119.306598097" watchObservedRunningTime="2025-10-01 16:04:16.307456604 +0000 UTC m=+119.307103449" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.309187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/30ca0834-53fb-48da-9d09-683a2f922aa0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.309227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ca0834-53fb-48da-9d09-683a2f922aa0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.309256 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30ca0834-53fb-48da-9d09-683a2f922aa0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.309279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/30ca0834-53fb-48da-9d09-683a2f922aa0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.309349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30ca0834-53fb-48da-9d09-683a2f922aa0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.332534 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2bzj9" podStartSLOduration=94.332508718 podStartE2EDuration="1m34.332508718s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.332141669 +0000 UTC m=+119.331788524" watchObservedRunningTime="2025-10-01 16:04:16.332508718 +0000 UTC m=+119.332155573" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.345699 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podStartSLOduration=94.345680651 podStartE2EDuration="1m34.345680651s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.345368303 +0000 UTC m=+119.345015168" watchObservedRunningTime="2025-10-01 16:04:16.345680651 +0000 UTC m=+119.345327506" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.392542 4764 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=67.39251998 podStartE2EDuration="1m7.39251998s" podCreationTimestamp="2025-10-01 16:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.392468139 +0000 UTC m=+119.392114984" watchObservedRunningTime="2025-10-01 16:04:16.39251998 +0000 UTC m=+119.392166815" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.392837 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=39.392831778 podStartE2EDuration="39.392831778s" podCreationTimestamp="2025-10-01 16:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.378296582 +0000 UTC m=+119.377943467" watchObservedRunningTime="2025-10-01 16:04:16.392831778 +0000 UTC m=+119.392478613" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.410221 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30ca0834-53fb-48da-9d09-683a2f922aa0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.410771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/30ca0834-53fb-48da-9d09-683a2f922aa0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.410931 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/30ca0834-53fb-48da-9d09-683a2f922aa0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.411190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ca0834-53fb-48da-9d09-683a2f922aa0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.411359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30ca0834-53fb-48da-9d09-683a2f922aa0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.411513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/30ca0834-53fb-48da-9d09-683a2f922aa0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.411669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/30ca0834-53fb-48da-9d09-683a2f922aa0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: 
\"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.411947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/30ca0834-53fb-48da-9d09-683a2f922aa0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.416928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30ca0834-53fb-48da-9d09-683a2f922aa0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.421432 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ks425" podStartSLOduration=94.421416559 podStartE2EDuration="1m34.421416559s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.420833995 +0000 UTC m=+119.420480830" watchObservedRunningTime="2025-10-01 16:04:16.421416559 +0000 UTC m=+119.421063394" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.430436 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/30ca0834-53fb-48da-9d09-683a2f922aa0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b6p95\" (UID: \"30ca0834-53fb-48da-9d09-683a2f922aa0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.454803 
4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-dt47g" podStartSLOduration=94.454782667 podStartE2EDuration="1m34.454782667s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.454768497 +0000 UTC m=+119.454415332" watchObservedRunningTime="2025-10-01 16:04:16.454782667 +0000 UTC m=+119.454429502" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.493480 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=95.493462326 podStartE2EDuration="1m35.493462326s" podCreationTimestamp="2025-10-01 16:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.480625521 +0000 UTC m=+119.480272366" watchObservedRunningTime="2025-10-01 16:04:16.493462326 +0000 UTC m=+119.493109161" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.538952 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" Oct 01 16:04:16 crc kubenswrapper[4764]: I1001 16:04:16.545557 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vq8z5" podStartSLOduration=94.545542293 podStartE2EDuration="1m34.545542293s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:16.544945279 +0000 UTC m=+119.544592114" watchObservedRunningTime="2025-10-01 16:04:16.545542293 +0000 UTC m=+119.545189128" Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.282460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" event={"ID":"30ca0834-53fb-48da-9d09-683a2f922aa0","Type":"ContainerStarted","Data":"ead7c48919c64caa110be9a2ca1924844ccd784e178572f0e0779567d5991c99"} Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.282505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" event={"ID":"30ca0834-53fb-48da-9d09-683a2f922aa0","Type":"ContainerStarted","Data":"bfd5117669dd48a8456a5a959eb9d9bde7c3a40b0b20aebafc6bb37740a71857"} Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.284457 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/1.log" Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.721009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.721091 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.721143 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.722619 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.722684 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.722800 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.722890 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.723323 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:17 crc kubenswrapper[4764]: I1001 16:04:17.723607 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.723792 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fngxf_openshift-ovn-kubernetes(fe0fc1af-28a8-48cd-ba84-954c8e7de3e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.725527 4764 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 16:04:17 crc kubenswrapper[4764]: E1001 16:04:17.843188 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 16:04:19 crc kubenswrapper[4764]: I1001 16:04:19.721544 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:19 crc kubenswrapper[4764]: I1001 16:04:19.721608 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:19 crc kubenswrapper[4764]: I1001 16:04:19.721625 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:19 crc kubenswrapper[4764]: E1001 16:04:19.721943 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:19 crc kubenswrapper[4764]: I1001 16:04:19.721637 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:19 crc kubenswrapper[4764]: E1001 16:04:19.721882 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:19 crc kubenswrapper[4764]: E1001 16:04:19.721754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:19 crc kubenswrapper[4764]: E1001 16:04:19.722219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:21 crc kubenswrapper[4764]: I1001 16:04:21.721407 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:21 crc kubenswrapper[4764]: I1001 16:04:21.721453 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:21 crc kubenswrapper[4764]: I1001 16:04:21.721491 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:21 crc kubenswrapper[4764]: I1001 16:04:21.721431 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:21 crc kubenswrapper[4764]: E1001 16:04:21.721568 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:21 crc kubenswrapper[4764]: E1001 16:04:21.721668 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:21 crc kubenswrapper[4764]: E1001 16:04:21.721719 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:21 crc kubenswrapper[4764]: E1001 16:04:21.721745 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:22 crc kubenswrapper[4764]: E1001 16:04:22.844734 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 16:04:23 crc kubenswrapper[4764]: I1001 16:04:23.720891 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:23 crc kubenswrapper[4764]: I1001 16:04:23.720979 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:23 crc kubenswrapper[4764]: I1001 16:04:23.720912 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:23 crc kubenswrapper[4764]: I1001 16:04:23.720891 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:23 crc kubenswrapper[4764]: E1001 16:04:23.721036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:23 crc kubenswrapper[4764]: E1001 16:04:23.721160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:23 crc kubenswrapper[4764]: E1001 16:04:23.721257 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:23 crc kubenswrapper[4764]: E1001 16:04:23.721334 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:25 crc kubenswrapper[4764]: I1001 16:04:25.721463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:25 crc kubenswrapper[4764]: E1001 16:04:25.721990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:25 crc kubenswrapper[4764]: I1001 16:04:25.722378 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:25 crc kubenswrapper[4764]: E1001 16:04:25.722492 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:25 crc kubenswrapper[4764]: I1001 16:04:25.722685 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:25 crc kubenswrapper[4764]: E1001 16:04:25.722769 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:25 crc kubenswrapper[4764]: I1001 16:04:25.722956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:25 crc kubenswrapper[4764]: E1001 16:04:25.723035 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:27 crc kubenswrapper[4764]: I1001 16:04:27.721342 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:27 crc kubenswrapper[4764]: I1001 16:04:27.721408 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:27 crc kubenswrapper[4764]: E1001 16:04:27.722709 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:27 crc kubenswrapper[4764]: I1001 16:04:27.722769 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:27 crc kubenswrapper[4764]: I1001 16:04:27.722803 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:27 crc kubenswrapper[4764]: E1001 16:04:27.722884 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:27 crc kubenswrapper[4764]: E1001 16:04:27.723450 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:27 crc kubenswrapper[4764]: E1001 16:04:27.723482 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:27 crc kubenswrapper[4764]: E1001 16:04:27.845538 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 16:04:29 crc kubenswrapper[4764]: I1001 16:04:29.721153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:29 crc kubenswrapper[4764]: I1001 16:04:29.721220 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:29 crc kubenswrapper[4764]: I1001 16:04:29.721247 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:29 crc kubenswrapper[4764]: I1001 16:04:29.721251 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:29 crc kubenswrapper[4764]: E1001 16:04:29.721382 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:29 crc kubenswrapper[4764]: E1001 16:04:29.721486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:29 crc kubenswrapper[4764]: E1001 16:04:29.721741 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:29 crc kubenswrapper[4764]: E1001 16:04:29.721841 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:29 crc kubenswrapper[4764]: I1001 16:04:29.721896 4764 scope.go:117] "RemoveContainer" containerID="a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d" Oct 01 16:04:29 crc kubenswrapper[4764]: I1001 16:04:29.740747 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b6p95" podStartSLOduration=107.740727776 podStartE2EDuration="1m47.740727776s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:17.297980359 +0000 UTC m=+120.297627194" watchObservedRunningTime="2025-10-01 16:04:29.740727776 +0000 UTC m=+132.740374621" Oct 01 16:04:30 crc kubenswrapper[4764]: I1001 16:04:30.328318 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/1.log" Oct 01 16:04:30 crc kubenswrapper[4764]: I1001 16:04:30.328685 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerStarted","Data":"ebe948ccdf109b264c30c2e6b27c52173e08727669e0354529418595261bf85b"} Oct 01 16:04:31 crc kubenswrapper[4764]: I1001 16:04:31.721262 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:31 crc kubenswrapper[4764]: I1001 16:04:31.721290 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:31 crc kubenswrapper[4764]: I1001 16:04:31.721275 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:31 crc kubenswrapper[4764]: E1001 16:04:31.721394 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:31 crc kubenswrapper[4764]: E1001 16:04:31.721454 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:31 crc kubenswrapper[4764]: I1001 16:04:31.721509 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:31 crc kubenswrapper[4764]: E1001 16:04:31.721659 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:31 crc kubenswrapper[4764]: E1001 16:04:31.721527 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:32 crc kubenswrapper[4764]: I1001 16:04:32.722147 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:04:32 crc kubenswrapper[4764]: E1001 16:04:32.847149 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.343267 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/3.log" Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.346603 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerStarted","Data":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.347165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.380733 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podStartSLOduration=111.380710713 podStartE2EDuration="1m51.380710713s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:33.379842982 +0000 UTC m=+136.379489827" watchObservedRunningTime="2025-10-01 16:04:33.380710713 +0000 UTC m=+136.380357578" Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.647251 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-btbfp"] Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.647603 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:33 crc kubenswrapper[4764]: E1001 16:04:33.647699 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.721808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:33 crc kubenswrapper[4764]: E1001 16:04:33.721927 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.721987 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:33 crc kubenswrapper[4764]: E1001 16:04:33.722084 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:33 crc kubenswrapper[4764]: I1001 16:04:33.722133 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:33 crc kubenswrapper[4764]: E1001 16:04:33.722189 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:35 crc kubenswrapper[4764]: I1001 16:04:35.721309 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:35 crc kubenswrapper[4764]: I1001 16:04:35.721346 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:35 crc kubenswrapper[4764]: E1001 16:04:35.721449 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:35 crc kubenswrapper[4764]: I1001 16:04:35.721508 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:35 crc kubenswrapper[4764]: E1001 16:04:35.721697 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:35 crc kubenswrapper[4764]: I1001 16:04:35.721969 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:35 crc kubenswrapper[4764]: E1001 16:04:35.722092 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:35 crc kubenswrapper[4764]: E1001 16:04:35.722296 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:37 crc kubenswrapper[4764]: I1001 16:04:37.720814 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:37 crc kubenswrapper[4764]: I1001 16:04:37.720813 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:37 crc kubenswrapper[4764]: I1001 16:04:37.720904 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:37 crc kubenswrapper[4764]: E1001 16:04:37.721784 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 16:04:37 crc kubenswrapper[4764]: I1001 16:04:37.721798 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:37 crc kubenswrapper[4764]: E1001 16:04:37.721922 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 16:04:37 crc kubenswrapper[4764]: E1001 16:04:37.722010 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-btbfp" podUID="41a0358d-ae10-4282-9423-8f3599adbc2a" Oct 01 16:04:37 crc kubenswrapper[4764]: E1001 16:04:37.722117 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.721956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.722038 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.722033 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.721960 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.726751 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.728882 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.729217 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.729500 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.729583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 16:04:39 crc kubenswrapper[4764]: I1001 16:04:39.729908 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.536682 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:45 crc kubenswrapper[4764]: E1001 16:04:45.536917 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:06:47.536876543 +0000 UTC m=+270.536523408 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.537491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.537625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.539310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.547004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.638856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.638912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.642952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.643982 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:45 crc 
kubenswrapper[4764]: I1001 16:04:45.748207 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.757670 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:45 crc kubenswrapper[4764]: I1001 16:04:45.767934 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 16:04:46 crc kubenswrapper[4764]: W1001 16:04:46.199087 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-96520164b480977bb8c825f302f2e2780b7f582551154055fa4cd3df5319532b WatchSource:0}: Error finding container 96520164b480977bb8c825f302f2e2780b7f582551154055fa4cd3df5319532b: Status 404 returned error can't find the container with id 96520164b480977bb8c825f302f2e2780b7f582551154055fa4cd3df5319532b Oct 01 16:04:46 crc kubenswrapper[4764]: W1001 16:04:46.199622 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-3463f76bdbec7469a03d191acff77420b1a64b06b272d66c77cffc37d06aa784 WatchSource:0}: Error finding container 3463f76bdbec7469a03d191acff77420b1a64b06b272d66c77cffc37d06aa784: Status 404 returned error can't find the container with id 3463f76bdbec7469a03d191acff77420b1a64b06b272d66c77cffc37d06aa784 Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.394278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c1ec894031ab91053f693bfe9c6e88a8157b2a7aa2567541509065206973459a"} Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.394359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e3b93310f0591a7ce46a0ea16dad1240b7594c63fd10fed89a567797d76c14a3"} Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.396015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"16c30fdab6c661bc6fc80231c23f4e415487195ce2eccba666c018f43cdbb0aa"} Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.396066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"96520164b480977bb8c825f302f2e2780b7f582551154055fa4cd3df5319532b"} Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.399765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"eb64d89149064336c90f3254f0d52c91bd09aea2cea6dbf07ccbbd527f53781f"} Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.399821 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3463f76bdbec7469a03d191acff77420b1a64b06b272d66c77cffc37d06aa784"} Oct 01 16:04:46 crc kubenswrapper[4764]: I1001 16:04:46.400454 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.019996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.063858 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.064858 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.065719 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.066115 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.067517 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lt4t4"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.068169 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.068592 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.069258 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5trjp"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.069550 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.070880 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.071097 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j7l4"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.071232 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.071385 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.072454 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.072778 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.072909 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6bg4z"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.073553 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.074486 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.074885 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.081346 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.081498 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.082066 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.082196 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.082351 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.084939 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ph942"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.085493 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.085628 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.086265 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wc7z5"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.086473 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.086902 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.087579 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.087937 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.088132 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.088500 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-927gg"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.088972 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.095638 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.096583 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.098730 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.099156 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.102431 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hn4rz"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.103167 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.103773 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.104429 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.104606 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.104723 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.105009 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.105171 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 
16:04:47.105370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.105741 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.105944 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.106199 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.106393 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.106547 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.106719 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.107449 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.107680 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.108014 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.108214 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.108375 
4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.108649 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.108801 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.108962 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.109134 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111371 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111619 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111721 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111771 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111929 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pfzm8"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111948 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.111986 
4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.112330 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.112499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.112555 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.112504 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.112742 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.115516 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.115851 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.127020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.116014 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.116100 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.116155 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.116197 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.116237 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.127595 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.126529 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.126816 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.126846 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.127337 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.127834 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.128143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 16:04:47 crc 
kubenswrapper[4764]: I1001 16:04:47.128340 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.128410 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.128764 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.130218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.130443 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.131109 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.131985 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m4gx9"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.132870 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.145739 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.145752 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.145893 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146032 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146155 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146307 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146368 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146469 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146650 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.146788 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.151524 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.151940 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.152122 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.152250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.153860 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.153942 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1d95364-40e4-46a6-a2de-3a94a8cda31e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qp66p\" (UID: \"a1d95364-40e4-46a6-a2de-3a94a8cda31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81f7ca-2bc8-4d14-a101-e73361300228-config\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154226 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbnk\" (UniqueName: 
\"kubernetes.io/projected/a1d95364-40e4-46a6-a2de-3a94a8cda31e-kube-api-access-fzbnk\") pod \"cluster-samples-operator-665b6dd947-qp66p\" (UID: \"a1d95364-40e4-46a6-a2de-3a94a8cda31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154248 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-service-ca\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54bc06f9-fb4d-496c-af11-0d4acf39f27b-serving-cert\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154291 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4j5s\" (UniqueName: \"kubernetes.io/projected/9cc0e821-77ea-4840-be3d-1165904bf50d-kube-api-access-z4j5s\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc 
kubenswrapper[4764]: I1001 16:04:47.154336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e81f7ca-2bc8-4d14-a101-e73361300228-images\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-image-import-ca\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-dir\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154429 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccbwc\" (UniqueName: 
\"kubernetes.io/projected/35ad23c6-6d86-4e4f-b642-336f47fe999c-kube-api-access-ccbwc\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154497 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gtz\" (UniqueName: \"kubernetes.io/projected/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-kube-api-access-p9gtz\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154516 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: 
\"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54bc06f9-fb4d-496c-af11-0d4acf39f27b-config\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-policies\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154604 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-config\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7afea08-2815-437c-b5ce-26e40f80edda-audit-dir\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-oauth-serving-cert\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-serving-cert\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154980 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.154993 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-config\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " 
pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54bc06f9-fb4d-496c-af11-0d4acf39f27b-trusted-ca\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155082 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-encryption-config\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rf7g8\" (UniqueName: \"kubernetes.io/projected/54bc06f9-fb4d-496c-af11-0d4acf39f27b-kube-api-access-rf7g8\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-oauth-config\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp88p\" (UniqueName: \"kubernetes.io/projected/c7afea08-2815-437c-b5ce-26e40f80edda-kube-api-access-hp88p\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155143 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-audit\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfn4x\" (UniqueName: \"kubernetes.io/projected/1e81f7ca-2bc8-4d14-a101-e73361300228-kube-api-access-jfn4x\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc 
kubenswrapper[4764]: I1001 16:04:47.155173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-trusted-ca-bundle\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-etcd-serving-ca\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7afea08-2815-437c-b5ce-26e40f80edda-node-pullsecrets\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155247 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-serving-cert\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-etcd-client\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e81f7ca-2bc8-4d14-a101-e73361300228-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155517 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.155707 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157260 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 
16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157265 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157417 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157423 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157498 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157791 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.157836 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.158524 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.159695 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.159815 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.185624 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: 
I1001 16:04:47.187099 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.192828 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t47fw"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.193605 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.195749 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.196112 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.196295 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.196394 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.199546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.203432 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.205756 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.206151 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" 
Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.206364 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.206368 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.206392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.206767 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.207272 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.207883 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.208397 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.217253 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-n4vzm"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.218587 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-29nrh"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.218886 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.220244 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.221648 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.236507 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.267711 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.267963 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.269724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54bc06f9-fb4d-496c-af11-0d4acf39f27b-serving-cert\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z4j5s\" (UniqueName: \"kubernetes.io/projected/9cc0e821-77ea-4840-be3d-1165904bf50d-kube-api-access-z4j5s\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e81f7ca-2bc8-4d14-a101-e73361300228-images\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-image-import-ca\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-dir\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 
16:04:47.275628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275644 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccbwc\" (UniqueName: \"kubernetes.io/projected/35ad23c6-6d86-4e4f-b642-336f47fe999c-kube-api-access-ccbwc\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gtz\" (UniqueName: \"kubernetes.io/projected/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-kube-api-access-p9gtz\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54bc06f9-fb4d-496c-af11-0d4acf39f27b-config\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-policies\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-config\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-oauth-serving-cert\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275861 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7afea08-2815-437c-b5ce-26e40f80edda-audit-dir\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 
16:04:47.275920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-config\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-serving-cert\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275956 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.275999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: 
\"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54bc06f9-fb4d-496c-af11-0d4acf39f27b-trusted-ca\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-encryption-config\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf7g8\" (UniqueName: \"kubernetes.io/projected/54bc06f9-fb4d-496c-af11-0d4acf39f27b-kube-api-access-rf7g8\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-oauth-config\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp88p\" (UniqueName: 
\"kubernetes.io/projected/c7afea08-2815-437c-b5ce-26e40f80edda-kube-api-access-hp88p\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfn4x\" (UniqueName: \"kubernetes.io/projected/1e81f7ca-2bc8-4d14-a101-e73361300228-kube-api-access-jfn4x\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-audit\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-trusted-ca-bundle\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-etcd-serving-ca\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276195 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7afea08-2815-437c-b5ce-26e40f80edda-node-pullsecrets\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-serving-cert\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276282 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-etcd-client\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276303 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e81f7ca-2bc8-4d14-a101-e73361300228-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1d95364-40e4-46a6-a2de-3a94a8cda31e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qp66p\" (UID: \"a1d95364-40e4-46a6-a2de-3a94a8cda31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81f7ca-2bc8-4d14-a101-e73361300228-config\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbnk\" (UniqueName: \"kubernetes.io/projected/a1d95364-40e4-46a6-a2de-3a94a8cda31e-kube-api-access-fzbnk\") pod \"cluster-samples-operator-665b6dd947-qp66p\" (UID: \"a1d95364-40e4-46a6-a2de-3a94a8cda31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.276370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-service-ca\") pod \"console-f9d7485db-pfzm8\" (UID: 
\"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.277302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-service-ca\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.270894 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lt4t4"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.277355 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.277869 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7wm6j"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.278226 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.280232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.270964 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.281006 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.281181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e81f7ca-2bc8-4d14-a101-e73361300228-images\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.283042 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.283568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-image-import-ca\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.283929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.284126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54bc06f9-fb4d-496c-af11-0d4acf39f27b-serving-cert\") pod \"console-operator-58897d9998-927gg\" (UID: 
\"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.284165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-oauth-serving-cert\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.284729 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.285149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54bc06f9-fb4d-496c-af11-0d4acf39f27b-trusted-ca\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.285405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7afea08-2815-437c-b5ce-26e40f80edda-audit-dir\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.288241 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-audit\") pod \"apiserver-76f77b778f-6bg4z\" (UID: 
\"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.288778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.289305 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-config\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.289515 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.289605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.289843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-dir\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.292618 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.292959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-policies\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.293227 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-oauth-config\") pod 
\"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.294104 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-serving-cert\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.294301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.294632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.295844 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.296684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-config\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc 
kubenswrapper[4764]: I1001 16:04:47.296910 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.297742 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.299298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-trusted-ca-bundle\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.299573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e81f7ca-2bc8-4d14-a101-e73361300228-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.299888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54bc06f9-fb4d-496c-af11-0d4acf39f27b-config\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.300262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81f7ca-2bc8-4d14-a101-e73361300228-config\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:47 crc 
kubenswrapper[4764]: I1001 16:04:47.300376 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.300763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7afea08-2815-437c-b5ce-26e40f80edda-node-pullsecrets\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.301745 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qk"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.303542 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.303826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.304094 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.304152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-encryption-config\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.308669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7afea08-2815-437c-b5ce-26e40f80edda-etcd-serving-ca\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.308686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a1d95364-40e4-46a6-a2de-3a94a8cda31e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qp66p\" (UID: \"a1d95364-40e4-46a6-a2de-3a94a8cda31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.307955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7afea08-2815-437c-b5ce-26e40f80edda-etcd-client\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.312035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.315519 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.316337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-serving-cert\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.316735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.316855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.317756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hn4rz\" 
(UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.318424 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.319106 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jcsqb"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.319683 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2gkmm"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.319914 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.319935 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.320722 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.321048 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.321231 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.321359 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.321657 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.328351 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.328910 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.332846 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.333341 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.334633 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.336369 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.340943 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.345346 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.345942 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.350960 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.358928 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.359547 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.359825 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j7l4"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.359903 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.360292 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.367114 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hn4rz"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.368002 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-927gg"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.369677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5trjp"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.372108 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.375226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.377783 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pfzm8"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.379910 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m4gx9"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.381182 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.383213 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.384200 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64"] Oct 01 16:04:47 crc 
kubenswrapper[4764]: I1001 16:04:47.386885 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wc7z5"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.388673 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.393213 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.393890 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-29nrh"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.395326 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.397271 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6bg4z"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.399432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.402438 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t47fw"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.407512 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.409184 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qk"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.409816 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.413430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.415597 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.417733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.419066 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6kz6b"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.419855 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.420356 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.422065 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7wm6j"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.423728 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.425342 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.426796 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 
16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.426960 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.428377 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.430169 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.431929 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wcpvk"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.432891 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.433173 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.434974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2gkmm"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.436716 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.438304 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kz6b"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.439822 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wcpvk"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.441389 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-ingress-canary/ingress-canary-dp2zw"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.441976 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.442833 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp2zw"] Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.447433 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.466921 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.486855 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.508740 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.527030 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.547669 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.567337 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.587430 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 
16:04:47.606430 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.627795 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.647608 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.668333 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.687108 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.706871 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.726886 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.747889 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.767763 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.787549 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.807733 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.827179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.849833 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.866811 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.887463 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.907288 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.946984 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4j5s\" (UniqueName: \"kubernetes.io/projected/9cc0e821-77ea-4840-be3d-1165904bf50d-kube-api-access-z4j5s\") pod \"oauth-openshift-558db77b4-hn4rz\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.947279 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.967877 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 16:04:47 crc kubenswrapper[4764]: I1001 16:04:47.989269 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 16:04:48 crc 
kubenswrapper[4764]: I1001 16:04:48.020220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf7g8\" (UniqueName: \"kubernetes.io/projected/54bc06f9-fb4d-496c-af11-0d4acf39f27b-kube-api-access-rf7g8\") pod \"console-operator-58897d9998-927gg\" (UID: \"54bc06f9-fb4d-496c-af11-0d4acf39f27b\") " pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.022814 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.047699 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.051925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbnk\" (UniqueName: \"kubernetes.io/projected/a1d95364-40e4-46a6-a2de-3a94a8cda31e-kube-api-access-fzbnk\") pod \"cluster-samples-operator-665b6dd947-qp66p\" (UID: \"a1d95364-40e4-46a6-a2de-3a94a8cda31e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.067425 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.086943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.106804 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.127908 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 
16:04:48.146178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.147254 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.187699 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.188902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gtz\" (UniqueName: \"kubernetes.io/projected/d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53-kube-api-access-p9gtz\") pod \"openshift-controller-manager-operator-756b6f6bc6-9np25\" (UID: \"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.225177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp88p\" (UniqueName: \"kubernetes.io/projected/c7afea08-2815-437c-b5ce-26e40f80edda-kube-api-access-hp88p\") pod \"apiserver-76f77b778f-6bg4z\" (UID: \"c7afea08-2815-437c-b5ce-26e40f80edda\") " pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.246825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfn4x\" (UniqueName: \"kubernetes.io/projected/1e81f7ca-2bc8-4d14-a101-e73361300228-kube-api-access-jfn4x\") pod \"machine-api-operator-5694c8668f-lt4t4\" (UID: \"1e81f7ca-2bc8-4d14-a101-e73361300228\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.264416 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ccbwc\" (UniqueName: \"kubernetes.io/projected/35ad23c6-6d86-4e4f-b642-336f47fe999c-kube-api-access-ccbwc\") pod \"console-f9d7485db-pfzm8\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.268528 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.287573 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.296589 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.301421 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hn4rz"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.305806 4764 request.go:700] Waited for 1.005113797s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dkube-scheduler-operator-serving-cert&limit=500&resourceVersion=0 Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.307988 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 16:04:48 crc kubenswrapper[4764]: W1001 16:04:48.308655 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc0e821_77ea_4840_be3d_1165904bf50d.slice/crio-9f7f0431c4b4cf35921600b65b7eb4ec151e735e08ec8d7cf2a665a196ca2ebc WatchSource:0}: Error 
finding container 9f7f0431c4b4cf35921600b65b7eb4ec151e735e08ec8d7cf2a665a196ca2ebc: Status 404 returned error can't find the container with id 9f7f0431c4b4cf35921600b65b7eb4ec151e735e08ec8d7cf2a665a196ca2ebc Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.310217 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.327467 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.339660 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.346900 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.348311 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.368294 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.370530 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.388836 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.407821 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.420687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" event={"ID":"9cc0e821-77ea-4840-be3d-1165904bf50d","Type":"ContainerStarted","Data":"9f7f0431c4b4cf35921600b65b7eb4ec151e735e08ec8d7cf2a665a196ca2ebc"} Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.427800 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.430463 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.471423 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.482851 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-927gg"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.487138 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.489339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aedccabf-870c-4bca-9d09-028aa2416702-serving-cert\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd433de1-a494-45e1-9a19-1b619fe7c3bc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvng2\" (UniqueName: \"kubernetes.io/projected/964956b3-18e0-4894-b0c0-960a6770384e-kube-api-access-jvng2\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502199 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-service-ca-bundle\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7bd2b073-6d2e-4de8-b164-f853c7e01794-audit-dir\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-tls\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 
16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502258 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpj9c\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-kube-api-access-qpj9c\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd433de1-a494-45e1-9a19-1b619fe7c3bc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z668p\" (UniqueName: \"kubernetes.io/projected/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-kube-api-access-z668p\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m62rq\" (UniqueName: \"kubernetes.io/projected/807b336c-8d01-4644-8c75-1e9da97c0d23-kube-api-access-m62rq\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502324 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-config\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502346 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807b336c-8d01-4644-8c75-1e9da97c0d23-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-config\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502397 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502412 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd6rs\" (UniqueName: \"kubernetes.io/projected/7bd2b073-6d2e-4de8-b164-f853c7e01794-kube-api-access-qd6rs\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd701ac0-1755-4925-97c5-d0def443d990-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd433de1-a494-45e1-9a19-1b619fe7c3bc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mphl7\" (UniqueName: \"kubernetes.io/projected/dd433de1-a494-45e1-9a19-1b619fe7c3bc-kube-api-access-mphl7\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502508 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-machine-approver-tls\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502527 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-certificates\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502545 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc737f5-37f4-47a3-8716-af033cbe27fc-serving-cert\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-serving-cert\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502578 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkdbw\" (UniqueName: \"kubernetes.io/projected/bd701ac0-1755-4925-97c5-d0def443d990-kube-api-access-rkdbw\") pod 
\"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502594 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-config\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54tbr\" (UniqueName: \"kubernetes.io/projected/1dc737f5-37f4-47a3-8716-af033cbe27fc-kube-api-access-54tbr\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd701ac0-1755-4925-97c5-d0def443d990-serving-cert\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-encryption-config\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502680 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-586jt\" (UniqueName: \"kubernetes.io/projected/aedccabf-870c-4bca-9d09-028aa2416702-kube-api-access-586jt\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502712 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807b336c-8d01-4644-8c75-1e9da97c0d23-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502728 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-trusted-ca\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502744 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-audit-policies\") 
pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502761 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-bound-sa-token\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdr4k\" (UniqueName: \"kubernetes.io/projected/ce380ecc-2685-4ceb-85f6-617c8f7c0eaa-kube-api-access-xdr4k\") pod \"downloads-7954f5f757-5trjp\" (UID: \"ce380ecc-2685-4ceb-85f6-617c8f7c0eaa\") " pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502826 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-client-ca\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502841 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-client-ca\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-auth-proxy-config\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502885 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-config\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: 
\"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502920 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-etcd-client\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.502934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964956b3-18e0-4894-b0c0-960a6770384e-serving-cert\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.504528 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.004509721 +0000 UTC m=+152.004156556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.507786 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.521967 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.532943 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.560568 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.567297 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 16:04:48 crc kubenswrapper[4764]: W1001 16:04:48.568553 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dcc1bb_7a8b_4ff4_9b57_1f366baf7f53.slice/crio-e67a0e307602ac84a9cafb51086241d27010edee7fa6a9805a7675ec263198d7 WatchSource:0}: Error finding container e67a0e307602ac84a9cafb51086241d27010edee7fa6a9805a7675ec263198d7: Status 404 returned error can't find the container with id e67a0e307602ac84a9cafb51086241d27010edee7fa6a9805a7675ec263198d7 Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 
16:04:48.585103 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pfzm8"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.587503 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: W1001 16:04:48.594287 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35ad23c6_6d86_4e4f_b642_336f47fe999c.slice/crio-cbe3f3eb334fefead5e7a40cb6e06c52cd12a66c3e36343b0362fe84a3deac4d WatchSource:0}: Error finding container cbe3f3eb334fefead5e7a40cb6e06c52cd12a66c3e36343b0362fe84a3deac4d: Status 404 returned error can't find the container with id cbe3f3eb334fefead5e7a40cb6e06c52cd12a66c3e36343b0362fe84a3deac4d Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.603468 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.603650 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-machine-approver-tls\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.603683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/972c24b8-8c37-4214-b0f0-6449046a3eca-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.605141 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.10511283 +0000 UTC m=+152.104759675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605204 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c199b404-aae6-4d71-8aab-022be036624d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-certificates\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc 
kubenswrapper[4764]: I1001 16:04:48.605429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-serving-cert\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdt5h\" (UniqueName: \"kubernetes.io/projected/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-kube-api-access-qdt5h\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-apiservice-cert\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wwfw\" (UniqueName: \"kubernetes.io/projected/c04e7251-fb9c-4ba7-b06b-7c058c6b0859-kube-api-access-5wwfw\") pod \"migrator-59844c95c7-r9bvz\" (UID: \"c04e7251-fb9c-4ba7-b06b-7c058c6b0859\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605522 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkdbw\" (UniqueName: \"kubernetes.io/projected/bd701ac0-1755-4925-97c5-d0def443d990-kube-api-access-rkdbw\") pod 
\"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br89k\" (UniqueName: \"kubernetes.io/projected/d333f596-88f5-4641-b577-fd416e45c25d-kube-api-access-br89k\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54tbr\" (UniqueName: \"kubernetes.io/projected/1dc737f5-37f4-47a3-8716-af033cbe27fc-kube-api-access-54tbr\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b6f788-e88b-4751-b06d-3fb68a316f91-service-ca-bundle\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca157967-0878-4ffe-a609-6c4be43a9ee0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2gkmm\" (UID: \"ca157967-0878-4ffe-a609-6c4be43a9ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605732 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/83c8dab0-d64e-45be-b358-0baaaf1eca60-srv-cert\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.605768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd701ac0-1755-4925-97c5-d0def443d990-serving-cert\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.609542 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98d6aa4d-7d9b-4b28-82a8-75be1d101b6c-metrics-tls\") pod \"dns-operator-744455d44c-29nrh\" (UID: \"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.610520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972c24b8-8c37-4214-b0f0-6449046a3eca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.610577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/83c8dab0-d64e-45be-b358-0baaaf1eca60-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: 
\"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.610634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqg5\" (UniqueName: \"kubernetes.io/projected/01e2a53e-c853-4151-9c6c-a81917e54bf0-kube-api-access-fkqg5\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.610668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5lr7\" (UniqueName: \"kubernetes.io/projected/25878fb0-4dc8-47aa-b15d-ad43c06319a3-kube-api-access-t5lr7\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.610736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-586jt\" (UniqueName: \"kubernetes.io/projected/aedccabf-870c-4bca-9d09-028aa2416702-kube-api-access-586jt\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.610815 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.611109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m4gx9\" 
(UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.611203 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807b336c-8d01-4644-8c75-1e9da97c0d23-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-audit-policies\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-mountpoint-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612250 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwcn\" (UniqueName: \"kubernetes.io/projected/4cc85075-9ddf-47e1-9564-615f6f5f26ed-kube-api-access-snwcn\") pod \"ingress-canary-dp2zw\" (UID: \"4cc85075-9ddf-47e1-9564-615f6f5f26ed\") " pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-bound-sa-token\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-registration-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612454 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5101da2b-8945-460a-951e-a61632f28c98-serving-cert\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612490 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-client-ca\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612632 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-auth-proxy-config\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612664 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-tmpfs\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxrs\" (UniqueName: \"kubernetes.io/projected/f32960fb-d6fb-41bf-8e3c-4c26e3dd80af-kube-api-access-rsxrs\") pod \"package-server-manager-789f6589d5-zwnjx\" (UID: \"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-config\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612752 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: 
\"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8qwk\" (UniqueName: \"kubernetes.io/projected/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-kube-api-access-q8qwk\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.612817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15522425-a48b-41f0-8327-8ce597b09dd2-proxy-tls\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.614788 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd701ac0-1755-4925-97c5-d0def443d990-serving-cert\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.615100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-certificates\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.615128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81be073-688e-4025-8973-fd9147b4a8fa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.615854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.615889 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-default-certificate\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.615926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xpzw\" (UniqueName: \"kubernetes.io/projected/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-kube-api-access-7xpzw\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.615960 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-secret-volume\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.616084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-client-ca\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.616623 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7bd2b073-6d2e-4de8-b164-f853c7e01794-audit-dir\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.616632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-auth-proxy-config\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.616662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6d4d4d2-b14a-4159-9755-267ab804edb0-signing-key\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.617282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7bd2b073-6d2e-4de8-b164-f853c7e01794-audit-dir\") pod \"apiserver-7bbb656c7d-frl64\" (UID: 
\"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.617353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.617393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/807b336c-8d01-4644-8c75-1e9da97c0d23-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.617528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpj9c\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-kube-api-access-qpj9c\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.617927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.617989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-config\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618226 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01e2a53e-c853-4151-9c6c-a81917e54bf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618263 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lswq2\" (UniqueName: \"kubernetes.io/projected/eccae5ea-5d95-4d65-97cd-9d8ee4db20bc-kube-api-access-lswq2\") pod \"control-plane-machine-set-operator-78cbb6b69f-lntzx\" (UID: \"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/25878fb0-4dc8-47aa-b15d-ad43c06319a3-certs\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 
16:04:48.618394 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c199b404-aae6-4d71-8aab-022be036624d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-config\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618484 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-config\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618656 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-config\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618681 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd6rs\" (UniqueName: \"kubernetes.io/projected/7bd2b073-6d2e-4de8-b164-f853c7e01794-kube-api-access-qd6rs\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01e2a53e-c853-4151-9c6c-a81917e54bf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j65x\" (UniqueName: \"kubernetes.io/projected/c9b4fd74-b139-4155-920d-139eeda695d5-kube-api-access-5j65x\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.618754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.619207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-service-ca\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.619417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-config-volume\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.619452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/25878fb0-4dc8-47aa-b15d-ad43c06319a3-node-bootstrap-token\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.619477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/15522425-a48b-41f0-8327-8ce597b09dd2-images\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.619829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd701ac0-1755-4925-97c5-d0def443d990-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" 
Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.619872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd433de1-a494-45e1-9a19-1b619fe7c3bc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.620295 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mphl7\" (UniqueName: \"kubernetes.io/projected/dd433de1-a494-45e1-9a19-1b619fe7c3bc-kube-api-access-mphl7\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.620332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-plugins-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.620364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-csi-data-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.620391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55hq\" (UniqueName: \"kubernetes.io/projected/c6d4d4d2-b14a-4159-9755-267ab804edb0-kube-api-access-p55hq\") 
pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.620421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.620441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15522425-a48b-41f0-8327-8ce597b09dd2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.621848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7bd2b073-6d2e-4de8-b164-f853c7e01794-audit-policies\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.622615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc737f5-37f4-47a3-8716-af033cbe27fc-serving-cert\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.622734 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-serving-cert\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.623014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bd701ac0-1755-4925-97c5-d0def443d990-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.623086 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-config\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.623202 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2729\" (UniqueName: \"kubernetes.io/projected/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-kube-api-access-n2729\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.623767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d333f596-88f5-4641-b577-fd416e45c25d-config-volume\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.624017 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84x4w\" (UniqueName: \"kubernetes.io/projected/5101da2b-8945-460a-951e-a61632f28c98-kube-api-access-84x4w\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.624072 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcwm4\" (UniqueName: \"kubernetes.io/projected/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-kube-api-access-lcwm4\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.624326 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-config\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.624633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-config\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.624751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-encryption-config\") pod \"apiserver-7bbb656c7d-frl64\" (UID: 
\"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.624816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eccae5ea-5d95-4d65-97cd-9d8ee4db20bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lntzx\" (UID: \"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-config\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqv6\" (UniqueName: \"kubernetes.io/projected/ca157967-0878-4ffe-a609-6c4be43a9ee0-kube-api-access-lzqv6\") pod \"multus-admission-controller-857f4d67dd-2gkmm\" (UID: \"ca157967-0878-4ffe-a609-6c4be43a9ee0\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625331 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lt4t4"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-trusted-ca\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b81be073-688e-4025-8973-fd9147b4a8fa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625476 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/972c24b8-8c37-4214-b0f0-6449046a3eca-kube-api-access-rw2cl\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625503 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdr4k\" (UniqueName: \"kubernetes.io/projected/ce380ecc-2685-4ceb-85f6-617c8f7c0eaa-kube-api-access-xdr4k\") pod \"downloads-7954f5f757-5trjp\" (UID: \"ce380ecc-2685-4ceb-85f6-617c8f7c0eaa\") " 
pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-config\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d333f596-88f5-4641-b577-fd416e45c25d-metrics-tls\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.625990 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-554wf\" (UniqueName: \"kubernetes.io/projected/83c8dab0-d64e-45be-b358-0baaaf1eca60-kube-api-access-554wf\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-client-ca\") pod 
\"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626047 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-client\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqlkv\" (UniqueName: \"kubernetes.io/projected/98d6aa4d-7d9b-4b28-82a8-75be1d101b6c-kube-api-access-sqlkv\") pod \"dns-operator-744455d44c-29nrh\" (UID: \"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bln\" (UniqueName: \"kubernetes.io/projected/fb9458af-3147-4e40-8816-2d50eeaee101-kube-api-access-c8bln\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.626238 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.126226168 +0000 UTC m=+152.125873003 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cc85075-9ddf-47e1-9564-615f6f5f26ed-cert\") pod \"ingress-canary-dp2zw\" (UID: \"4cc85075-9ddf-47e1-9564-615f6f5f26ed\") " pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.626648 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.627866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-client-ca\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.628746 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qclhk\" (UniqueName: \"kubernetes.io/projected/58b6f788-e88b-4751-b06d-3fb68a316f91-kube-api-access-qclhk\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.629218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-encryption-config\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630294 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-trusted-ca\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-etcd-client\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964956b3-18e0-4894-b0c0-960a6770384e-serving-cert\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc 
kubenswrapper[4764]: I1001 16:04:48.630582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-stats-auth\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7br74\" (UniqueName: \"kubernetes.io/projected/15522425-a48b-41f0-8327-8ce597b09dd2-kube-api-access-7br74\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630647 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-webhook-cert\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-srv-cert\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aedccabf-870c-4bca-9d09-028aa2416702-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630755 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-ca\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd433de1-a494-45e1-9a19-1b619fe7c3bc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b4fd74-b139-4155-920d-139eeda695d5-serving-cert\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630914 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c199b404-aae6-4d71-8aab-022be036624d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.630991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvng2\" (UniqueName: \"kubernetes.io/projected/964956b3-18e0-4894-b0c0-960a6770384e-kube-api-access-jvng2\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.631033 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-service-ca-bundle\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.631092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-metrics-certs\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.631134 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81be073-688e-4025-8973-fd9147b4a8fa-config\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.631177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5101da2b-8945-460a-951e-a61632f28c98-config\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.631230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-tls\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: 
I1001 16:04:48.632279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd433de1-a494-45e1-9a19-1b619fe7c3bc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.632409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z668p\" (UniqueName: \"kubernetes.io/projected/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-kube-api-access-z668p\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.632524 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01e2a53e-c853-4151-9c6c-a81917e54bf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.632732 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6d4d4d2-b14a-4159-9755-267ab804edb0-signing-cabundle\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.632765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/964956b3-18e0-4894-b0c0-960a6770384e-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.632941 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f32960fb-d6fb-41bf-8e3c-4c26e3dd80af-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwnjx\" (UID: \"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.634158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd433de1-a494-45e1-9a19-1b619fe7c3bc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.634314 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m62rq\" (UniqueName: \"kubernetes.io/projected/807b336c-8d01-4644-8c75-1e9da97c0d23-kube-api-access-m62rq\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.634423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807b336c-8d01-4644-8c75-1e9da97c0d23-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 
16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.634463 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-socket-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.635247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.635497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/964956b3-18e0-4894-b0c0-960a6770384e-serving-cert\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.635833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.636070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc737f5-37f4-47a3-8716-af033cbe27fc-serving-cert\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: 
\"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.636161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-proxy-tls\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.636795 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-machine-approver-tls\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.638446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.639261 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7bd2b073-6d2e-4de8-b164-f853c7e01794-etcd-client\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:48 crc kubenswrapper[4764]: W1001 16:04:48.639288 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e81f7ca_2bc8_4d14_a101_e73361300228.slice/crio-27ac2b30f20e1463ad11e43e6582403c6e6ff2e1e9ebac7aebff2f91175fa2cc WatchSource:0}: Error finding container 27ac2b30f20e1463ad11e43e6582403c6e6ff2e1e9ebac7aebff2f91175fa2cc: Status 404 returned error can't find the container with id 27ac2b30f20e1463ad11e43e6582403c6e6ff2e1e9ebac7aebff2f91175fa2cc Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.639394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aedccabf-870c-4bca-9d09-028aa2416702-serving-cert\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.639769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd433de1-a494-45e1-9a19-1b619fe7c3bc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.640580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-tls\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.641104 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/807b336c-8d01-4644-8c75-1e9da97c0d23-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: 
\"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.647342 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.667381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.671308 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6bg4z"] Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.687783 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.707606 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 16:04:48 crc kubenswrapper[4764]: W1001 16:04:48.713852 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7afea08_2815_437c_b5ce_26e40f80edda.slice/crio-aa8fb908ca550464b02efff6584082a842c2219c33add539e14dc87989020dd7 WatchSource:0}: Error finding container aa8fb908ca550464b02efff6584082a842c2219c33add539e14dc87989020dd7: Status 404 returned error can't find the container with id aa8fb908ca550464b02efff6584082a842c2219c33add539e14dc87989020dd7 Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.726977 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.736965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.737155 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.237126913 +0000 UTC m=+152.236773748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqg5\" (UniqueName: \"kubernetes.io/projected/01e2a53e-c853-4151-9c6c-a81917e54bf0-kube-api-access-fkqg5\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737282 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5lr7\" (UniqueName: \"kubernetes.io/projected/25878fb0-4dc8-47aa-b15d-ad43c06319a3-kube-api-access-t5lr7\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 
16:04:48.737839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-mountpoint-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwcn\" (UniqueName: \"kubernetes.io/projected/4cc85075-9ddf-47e1-9564-615f6f5f26ed-kube-api-access-snwcn\") pod \"ingress-canary-dp2zw\" (UID: \"4cc85075-9ddf-47e1-9564-615f6f5f26ed\") " pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-registration-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5101da2b-8945-460a-951e-a61632f28c98-serving-cert\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-tmpfs\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc 
kubenswrapper[4764]: I1001 16:04:48.737931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-mountpoint-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxrs\" (UniqueName: \"kubernetes.io/projected/f32960fb-d6fb-41bf-8e3c-4c26e3dd80af-kube-api-access-rsxrs\") pod \"package-server-manager-789f6589d5-zwnjx\" (UID: \"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.737993 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8qwk\" (UniqueName: \"kubernetes.io/projected/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-kube-api-access-q8qwk\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738012 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15522425-a48b-41f0-8327-8ce597b09dd2-proxy-tls\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738029 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b81be073-688e-4025-8973-fd9147b4a8fa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: 
\"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738083 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-default-certificate\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xpzw\" (UniqueName: \"kubernetes.io/projected/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-kube-api-access-7xpzw\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-secret-volume\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6d4d4d2-b14a-4159-9755-267ab804edb0-signing-key\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/01e2a53e-c853-4151-9c6c-a81917e54bf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lswq2\" (UniqueName: \"kubernetes.io/projected/eccae5ea-5d95-4d65-97cd-9d8ee4db20bc-kube-api-access-lswq2\") pod \"control-plane-machine-set-operator-78cbb6b69f-lntzx\" (UID: \"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/25878fb0-4dc8-47aa-b15d-ad43c06319a3-certs\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c199b404-aae6-4d71-8aab-022be036624d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01e2a53e-c853-4151-9c6c-a81917e54bf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738332 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-registration-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j65x\" (UniqueName: \"kubernetes.io/projected/c9b4fd74-b139-4155-920d-139eeda695d5-kube-api-access-5j65x\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738389 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-service-ca\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-config-volume\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/25878fb0-4dc8-47aa-b15d-ad43c06319a3-node-bootstrap-token\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 
16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-tmpfs\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/15522425-a48b-41f0-8327-8ce597b09dd2-images\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738476 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-plugins-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-csi-data-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55hq\" (UniqueName: \"kubernetes.io/projected/c6d4d4d2-b14a-4159-9755-267ab804edb0-kube-api-access-p55hq\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 
16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15522425-a48b-41f0-8327-8ce597b09dd2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738559 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2729\" (UniqueName: \"kubernetes.io/projected/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-kube-api-access-n2729\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d333f596-88f5-4641-b577-fd416e45c25d-config-volume\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84x4w\" (UniqueName: \"kubernetes.io/projected/5101da2b-8945-460a-951e-a61632f28c98-kube-api-access-84x4w\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-plugins-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") 
" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcwm4\" (UniqueName: \"kubernetes.io/projected/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-kube-api-access-lcwm4\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eccae5ea-5d95-4d65-97cd-9d8ee4db20bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lntzx\" (UID: \"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-config\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738711 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lzqv6\" (UniqueName: \"kubernetes.io/projected/ca157967-0878-4ffe-a609-6c4be43a9ee0-kube-api-access-lzqv6\") pod \"multus-admission-controller-857f4d67dd-2gkmm\" (UID: \"ca157967-0878-4ffe-a609-6c4be43a9ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b81be073-688e-4025-8973-fd9147b4a8fa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738817 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/972c24b8-8c37-4214-b0f0-6449046a3eca-kube-api-access-rw2cl\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738883 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-config\") pod 
\"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d333f596-88f5-4641-b577-fd416e45c25d-metrics-tls\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-554wf\" (UniqueName: \"kubernetes.io/projected/83c8dab0-d64e-45be-b358-0baaaf1eca60-kube-api-access-554wf\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-client\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqlkv\" (UniqueName: \"kubernetes.io/projected/98d6aa4d-7d9b-4b28-82a8-75be1d101b6c-kube-api-access-sqlkv\") pod \"dns-operator-744455d44c-29nrh\" (UID: \"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.738990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8bln\" (UniqueName: 
\"kubernetes.io/projected/fb9458af-3147-4e40-8816-2d50eeaee101-kube-api-access-c8bln\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739013 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cc85075-9ddf-47e1-9564-615f6f5f26ed-cert\") pod \"ingress-canary-dp2zw\" (UID: \"4cc85075-9ddf-47e1-9564-615f6f5f26ed\") " pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739075 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qclhk\" (UniqueName: \"kubernetes.io/projected/58b6f788-e88b-4751-b06d-3fb68a316f91-kube-api-access-qclhk\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-stats-auth\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739125 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7br74\" (UniqueName: \"kubernetes.io/projected/15522425-a48b-41f0-8327-8ce597b09dd2-kube-api-access-7br74\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-webhook-cert\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739167 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-srv-cert\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-ca\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 
crc kubenswrapper[4764]: I1001 16:04:48.739235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b4fd74-b139-4155-920d-139eeda695d5-serving-cert\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739283 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-service-ca\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/15522425-a48b-41f0-8327-8ce597b09dd2-images\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: 
\"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739363 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c199b404-aae6-4d71-8aab-022be036624d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739406 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-metrics-certs\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81be073-688e-4025-8973-fd9147b4a8fa-config\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-csi-data-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.739453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5101da2b-8945-460a-951e-a61632f28c98-config\") 
pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01e2a53e-c853-4151-9c6c-a81917e54bf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6d4d4d2-b14a-4159-9755-267ab804edb0-signing-cabundle\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f32960fb-d6fb-41bf-8e3c-4c26e3dd80af-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwnjx\" (UID: \"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-socket-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-proxy-tls\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/972c24b8-8c37-4214-b0f0-6449046a3eca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c199b404-aae6-4d71-8aab-022be036624d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdt5h\" (UniqueName: \"kubernetes.io/projected/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-kube-api-access-qdt5h\") pod \"collect-profiles-29322240-f5k2s\" (UID: 
\"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-apiservice-cert\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wwfw\" (UniqueName: \"kubernetes.io/projected/c04e7251-fb9c-4ba7-b06b-7c058c6b0859-kube-api-access-5wwfw\") pod \"migrator-59844c95c7-r9bvz\" (UID: \"c04e7251-fb9c-4ba7-b06b-7c058c6b0859\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br89k\" (UniqueName: \"kubernetes.io/projected/d333f596-88f5-4641-b577-fd416e45c25d-kube-api-access-br89k\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740438 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-config\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/58b6f788-e88b-4751-b06d-3fb68a316f91-service-ca-bundle\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15522425-a48b-41f0-8327-8ce597b09dd2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca157967-0878-4ffe-a609-6c4be43a9ee0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2gkmm\" (UID: \"ca157967-0878-4ffe-a609-6c4be43a9ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b81be073-688e-4025-8973-fd9147b4a8fa-config\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.740619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb9458af-3147-4e40-8816-2d50eeaee101-socket-dir\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.740643 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.24062345 +0000 UTC m=+152.240270295 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741140 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/83c8dab0-d64e-45be-b358-0baaaf1eca60-srv-cert\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98d6aa4d-7d9b-4b28-82a8-75be1d101b6c-metrics-tls\") pod \"dns-operator-744455d44c-29nrh\" (UID: \"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c199b404-aae6-4d71-8aab-022be036624d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741219 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972c24b8-8c37-4214-b0f0-6449046a3eca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/01e2a53e-c853-4151-9c6c-a81917e54bf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/83c8dab0-d64e-45be-b358-0baaaf1eca60-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-config\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.741890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/972c24b8-8c37-4214-b0f0-6449046a3eca-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.742923 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.743595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-ca\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.743803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58b6f788-e88b-4751-b06d-3fb68a316f91-service-ca-bundle\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.744691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.746502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b81be073-688e-4025-8973-fd9147b4a8fa-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.746540 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c199b404-aae6-4d71-8aab-022be036624d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.746540 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9b4fd74-b139-4155-920d-139eeda695d5-serving-cert\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.746601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6d4d4d2-b14a-4159-9755-267ab804edb0-signing-cabundle\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.746755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/25878fb0-4dc8-47aa-b15d-ad43c06319a3-node-bootstrap-token\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.746991 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15522425-a48b-41f0-8327-8ce597b09dd2-proxy-tls\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.747180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/25878fb0-4dc8-47aa-b15d-ad43c06319a3-certs\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.747195 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6d4d4d2-b14a-4159-9755-267ab804edb0-signing-key\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.747186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-default-certificate\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.747514 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.747559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01e2a53e-c853-4151-9c6c-a81917e54bf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.748225 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c9b4fd74-b139-4155-920d-139eeda695d5-etcd-client\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.748958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/83c8dab0-d64e-45be-b358-0baaaf1eca60-srv-cert\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.749462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/972c24b8-8c37-4214-b0f0-6449046a3eca-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.749579 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-metrics-certs\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.749603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-proxy-tls\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: 
\"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.749888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/58b6f788-e88b-4751-b06d-3fb68a316f91-stats-auth\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.750489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98d6aa4d-7d9b-4b28-82a8-75be1d101b6c-metrics-tls\") pod \"dns-operator-744455d44c-29nrh\" (UID: \"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.751020 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.751864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.752158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-secret-volume\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.753903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/83c8dab0-d64e-45be-b358-0baaaf1eca60-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.759253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-profile-collector-cert\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.769225 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.787345 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.807542 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.826771 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.840999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca157967-0878-4ffe-a609-6c4be43a9ee0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2gkmm\" (UID: \"ca157967-0878-4ffe-a609-6c4be43a9ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.845117 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.845959 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.345937297 +0000 UTC m=+152.345584132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.847356 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.867812 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.887272 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.891018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5101da2b-8945-460a-951e-a61632f28c98-serving-cert\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.907921 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.911372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5101da2b-8945-460a-951e-a61632f28c98-config\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.926969 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.946984 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.947651 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:48 crc kubenswrapper[4764]: E1001 16:04:48.948138 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.448117095 +0000 UTC m=+152.447763930 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.954482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-srv-cert\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.967448 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.972396 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/eccae5ea-5d95-4d65-97cd-9d8ee4db20bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lntzx\" (UID: \"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:48 crc kubenswrapper[4764]: I1001 16:04:48.987509 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.007713 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 
16:04:49.014848 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-apiservice-cert\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.014847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-webhook-cert\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.027262 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.030124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-config-volume\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.047877 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.050303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc 
kubenswrapper[4764]: E1001 16:04:49.050559 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.550524448 +0000 UTC m=+152.550171293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.050989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.051509 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.551496941 +0000 UTC m=+152.551143776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.066958 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.077206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f32960fb-d6fb-41bf-8e3c-4c26e3dd80af-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwnjx\" (UID: \"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.108800 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.129285 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.137871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d333f596-88f5-4641-b577-fd416e45c25d-metrics-tls\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.147135 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 
16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.150696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d333f596-88f5-4641-b577-fd416e45c25d-config-volume\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.152857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.153979 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.653960667 +0000 UTC m=+152.653607512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.168007 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.188613 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.208932 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.228067 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.247892 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.254844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.255263 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.755242951 +0000 UTC m=+152.754889796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.257698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cc85075-9ddf-47e1-9564-615f6f5f26ed-cert\") pod \"ingress-canary-dp2zw\" (UID: \"4cc85075-9ddf-47e1-9564-615f6f5f26ed\") " pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.267827 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.287659 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.322633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54tbr\" (UniqueName: \"kubernetes.io/projected/1dc737f5-37f4-47a3-8716-af033cbe27fc-kube-api-access-54tbr\") pod \"controller-manager-879f6c89f-6j7l4\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.343137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-586jt\" (UniqueName: \"kubernetes.io/projected/aedccabf-870c-4bca-9d09-028aa2416702-kube-api-access-586jt\") pod \"route-controller-manager-6576b87f9c-n5qtc\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.356201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.356379 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.856356632 +0000 UTC m=+152.856003467 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.356861 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.357148 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.857140062 +0000 UTC m=+152.856786897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.360758 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-bound-sa-token\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.381502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpj9c\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-kube-api-access-qpj9c\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.402532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd6rs\" (UniqueName: \"kubernetes.io/projected/7bd2b073-6d2e-4de8-b164-f853c7e01794-kube-api-access-qd6rs\") pod \"apiserver-7bbb656c7d-frl64\" (UID: \"7bd2b073-6d2e-4de8-b164-f853c7e01794\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.423117 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd433de1-a494-45e1-9a19-1b619fe7c3bc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: 
\"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.426643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" event={"ID":"9cc0e821-77ea-4840-be3d-1165904bf50d","Type":"ContainerStarted","Data":"f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.426901 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.429157 4764 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hn4rz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.429204 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.429254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" event={"ID":"a1d95364-40e4-46a6-a2de-3a94a8cda31e","Type":"ContainerStarted","Data":"92192325d15a29bd02a5bedaf3ccd8fbcb3b81e18dd2e7b12078d080f9b927ed"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.429324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" 
event={"ID":"a1d95364-40e4-46a6-a2de-3a94a8cda31e","Type":"ContainerStarted","Data":"b1ef640f0f7b5b8195a6aad7d945a420e10548bd16f1a8da35b31964af22b489"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.429347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" event={"ID":"a1d95364-40e4-46a6-a2de-3a94a8cda31e","Type":"ContainerStarted","Data":"3e0aa37efaf6a0f7c4d19de277cb4ffee9d74b457c8e8794246a930c5667290a"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.431436 4764 generic.go:334] "Generic (PLEG): container finished" podID="c7afea08-2815-437c-b5ce-26e40f80edda" containerID="668d941ba1caf9117335b19128baee54f51fb2dec1812cc00a4d06a019386e78" exitCode=0 Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.431673 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" event={"ID":"c7afea08-2815-437c-b5ce-26e40f80edda","Type":"ContainerDied","Data":"668d941ba1caf9117335b19128baee54f51fb2dec1812cc00a4d06a019386e78"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.431705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" event={"ID":"c7afea08-2815-437c-b5ce-26e40f80edda","Type":"ContainerStarted","Data":"aa8fb908ca550464b02efff6584082a842c2219c33add539e14dc87989020dd7"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.434528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-927gg" event={"ID":"54bc06f9-fb4d-496c-af11-0d4acf39f27b","Type":"ContainerStarted","Data":"c0dcb10e9e4facd98916e848752013659c35ca951b77aa5bf2fd738c81ecfe18"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.434569 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 
16:04:49.434582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-927gg" event={"ID":"54bc06f9-fb4d-496c-af11-0d4acf39f27b","Type":"ContainerStarted","Data":"593ebc9757f2a7867a3590b2b96e9f5d758d8cdaeee23243236fcd0622c0d5e4"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.437358 4764 patch_prober.go:28] interesting pod/console-operator-58897d9998-927gg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.437422 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-927gg" podUID="54bc06f9-fb4d-496c-af11-0d4acf39f27b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.439463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" event={"ID":"1e81f7ca-2bc8-4d14-a101-e73361300228","Type":"ContainerStarted","Data":"c3e085d16091e30fd50bdb01643788c34c6aa94272f0f7314111a00cc1285c29"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.439512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" event={"ID":"1e81f7ca-2bc8-4d14-a101-e73361300228","Type":"ContainerStarted","Data":"07f78ca5eec4a4f66395ec30a9f05537a67a838904f03341caecc19ca6d1b951"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.439525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" 
event={"ID":"1e81f7ca-2bc8-4d14-a101-e73361300228","Type":"ContainerStarted","Data":"27ac2b30f20e1463ad11e43e6582403c6e6ff2e1e9ebac7aebff2f91175fa2cc"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.441423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pfzm8" event={"ID":"35ad23c6-6d86-4e4f-b642-336f47fe999c","Type":"ContainerStarted","Data":"b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.441615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pfzm8" event={"ID":"35ad23c6-6d86-4e4f-b642-336f47fe999c","Type":"ContainerStarted","Data":"cbe3f3eb334fefead5e7a40cb6e06c52cd12a66c3e36343b0362fe84a3deac4d"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.443712 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" event={"ID":"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53","Type":"ContainerStarted","Data":"c636316d90c8044f3c4c4ed0738e28846d56553973aeab1bb5c21bf7d8783ba4"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.443738 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" event={"ID":"d0dcc1bb-7a8b-4ff4-9b57-1f366baf7f53","Type":"ContainerStarted","Data":"e67a0e307602ac84a9cafb51086241d27010edee7fa6a9805a7675ec263198d7"} Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.448302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mphl7\" (UniqueName: \"kubernetes.io/projected/dd433de1-a494-45e1-9a19-1b619fe7c3bc-kube-api-access-mphl7\") pod \"cluster-image-registry-operator-dc59b4c8b-vdnsd\" (UID: \"dd433de1-a494-45e1-9a19-1b619fe7c3bc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:49 crc 
kubenswrapper[4764]: I1001 16:04:49.457993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.458432 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:49.958411977 +0000 UTC m=+152.958058812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.458552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.458855 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 16:04:49.958839347 +0000 UTC m=+152.958486182 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.460011 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkdbw\" (UniqueName: \"kubernetes.io/projected/bd701ac0-1755-4925-97c5-d0def443d990-kube-api-access-rkdbw\") pod \"openshift-config-operator-7777fb866f-nmtgg\" (UID: \"bd701ac0-1755-4925-97c5-d0def443d990\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.460201 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.481330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdr4k\" (UniqueName: \"kubernetes.io/projected/ce380ecc-2685-4ceb-85f6-617c8f7c0eaa-kube-api-access-xdr4k\") pod \"downloads-7954f5f757-5trjp\" (UID: \"ce380ecc-2685-4ceb-85f6-617c8f7c0eaa\") " pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.504334 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.506136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvng2\" (UniqueName: \"kubernetes.io/projected/964956b3-18e0-4894-b0c0-960a6770384e-kube-api-access-jvng2\") pod \"authentication-operator-69f744f599-wc7z5\" (UID: \"964956b3-18e0-4894-b0c0-960a6770384e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.523930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z668p\" (UniqueName: \"kubernetes.io/projected/fb95f0bc-df36-46d1-9c1e-39f3bdb97735-kube-api-access-z668p\") pod \"machine-approver-56656f9798-ph942\" (UID: \"fb95f0bc-df36-46d1-9c1e-39f3bdb97735\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.541008 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.547673 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m62rq\" (UniqueName: \"kubernetes.io/projected/807b336c-8d01-4644-8c75-1e9da97c0d23-kube-api-access-m62rq\") pod \"openshift-apiserver-operator-796bbdcf4f-sklbl\" (UID: \"807b336c-8d01-4644-8c75-1e9da97c0d23\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.568181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.568839 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.06880601 +0000 UTC m=+153.068452855 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.572930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqg5\" (UniqueName: \"kubernetes.io/projected/01e2a53e-c853-4151-9c6c-a81917e54bf0-kube-api-access-fkqg5\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.579713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.603210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5lr7\" (UniqueName: \"kubernetes.io/projected/25878fb0-4dc8-47aa-b15d-ad43c06319a3-kube-api-access-t5lr7\") pod \"machine-config-server-jcsqb\" (UID: \"25878fb0-4dc8-47aa-b15d-ad43c06319a3\") " pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.607245 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.618841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwcn\" (UniqueName: \"kubernetes.io/projected/4cc85075-9ddf-47e1-9564-615f6f5f26ed-kube-api-access-snwcn\") pod \"ingress-canary-dp2zw\" (UID: \"4cc85075-9ddf-47e1-9564-615f6f5f26ed\") " pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.636813 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.637660 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8qwk\" (UniqueName: \"kubernetes.io/projected/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-kube-api-access-q8qwk\") pod \"marketplace-operator-79b997595-cd5qk\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.647897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxrs\" (UniqueName: \"kubernetes.io/projected/f32960fb-d6fb-41bf-8e3c-4c26e3dd80af-kube-api-access-rsxrs\") pod \"package-server-manager-789f6589d5-zwnjx\" (UID: \"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.666822 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xpzw\" (UniqueName: \"kubernetes.io/projected/18323a84-6150-4a2d-bc8d-b29b4a0a7ddd-kube-api-access-7xpzw\") pod \"machine-config-controller-84d6567774-tdvpk\" (UID: \"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.667408 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.670918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.671289 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.171275484 +0000 UTC m=+153.170922319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.687767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j65x\" (UniqueName: \"kubernetes.io/projected/c9b4fd74-b139-4155-920d-139eeda695d5-kube-api-access-5j65x\") pod \"etcd-operator-b45778765-t47fw\" (UID: \"c9b4fd74-b139-4155-920d-139eeda695d5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.687964 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jcsqb" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.709613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2729\" (UniqueName: \"kubernetes.io/projected/fdbf81ba-852e-4222-8ed8-efd4dbd27a80-kube-api-access-n2729\") pod \"packageserver-d55dfcdfc-qb64g\" (UID: \"fdbf81ba-852e-4222-8ed8-efd4dbd27a80\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.725165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84x4w\" (UniqueName: \"kubernetes.io/projected/5101da2b-8945-460a-951e-a61632f28c98-kube-api-access-84x4w\") pod \"service-ca-operator-777779d784-mrdz7\" (UID: \"5101da2b-8945-460a-951e-a61632f28c98\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.734277 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64"] Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.737018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.741962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55hq\" (UniqueName: \"kubernetes.io/projected/c6d4d4d2-b14a-4159-9755-267ab804edb0-kube-api-access-p55hq\") pod \"service-ca-9c57cc56f-7wm6j\" (UID: \"c6d4d4d2-b14a-4159-9755-267ab804edb0\") " pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.745889 4764 request.go:700] Waited for 1.006215994s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/olm-operator-serviceaccount/token Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.767239 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcwm4\" (UniqueName: \"kubernetes.io/projected/16b03fe5-5fd8-460b-b4a1-0011c47bf79b-kube-api-access-lcwm4\") pod \"catalog-operator-68c6474976-jgb79\" (UID: \"16b03fe5-5fd8-460b-b4a1-0011c47bf79b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:49 crc kubenswrapper[4764]: W1001 16:04:49.770273 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd2b073_6d2e_4de8_b164_f853c7e01794.slice/crio-58b4776ec9364e60ed1dfa99206ae4100b40d456b9b3e2af17929a3343166088 WatchSource:0}: Error finding container 58b4776ec9364e60ed1dfa99206ae4100b40d456b9b3e2af17929a3343166088: Status 404 returned error can't find the container with id 
58b4776ec9364e60ed1dfa99206ae4100b40d456b9b3e2af17929a3343166088 Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.770648 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.771635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.772056 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.272026036 +0000 UTC m=+153.271672871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.772169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.772689 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.272673263 +0000 UTC m=+153.272320098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.784026 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.788429 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.788951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b81be073-688e-4025-8973-fd9147b4a8fa-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mt8wt\" (UID: \"b81be073-688e-4025-8973-fd9147b4a8fa\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.789261 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.801351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg"] Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.818058 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bln\" (UniqueName: \"kubernetes.io/projected/fb9458af-3147-4e40-8816-2d50eeaee101-kube-api-access-c8bln\") pod \"csi-hostpathplugin-wcpvk\" (UID: \"fb9458af-3147-4e40-8816-2d50eeaee101\") " pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.825121 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.827572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-554wf\" (UniqueName: \"kubernetes.io/projected/83c8dab0-d64e-45be-b358-0baaaf1eca60-kube-api-access-554wf\") pod \"olm-operator-6b444d44fb-bzmh9\" (UID: \"83c8dab0-d64e-45be-b358-0baaaf1eca60\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.827970 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp2zw" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.849552 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqv6\" (UniqueName: \"kubernetes.io/projected/ca157967-0878-4ffe-a609-6c4be43a9ee0-kube-api-access-lzqv6\") pod \"multus-admission-controller-857f4d67dd-2gkmm\" (UID: \"ca157967-0878-4ffe-a609-6c4be43a9ee0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.854623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd"] Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.861102 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.863264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wwfw\" (UniqueName: \"kubernetes.io/projected/c04e7251-fb9c-4ba7-b06b-7c058c6b0859-kube-api-access-5wwfw\") pod \"migrator-59844c95c7-r9bvz\" (UID: \"c04e7251-fb9c-4ba7-b06b-7c058c6b0859\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.874545 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.874871 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.37485731 +0000 UTC m=+153.374504145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.903628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c199b404-aae6-4d71-8aab-022be036624d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sv6hf\" (UID: \"c199b404-aae6-4d71-8aab-022be036624d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.904162 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.904759 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5trjp"] Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.907620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qclhk\" (UniqueName: \"kubernetes.io/projected/58b6f788-e88b-4751-b06d-3fb68a316f91-kube-api-access-qclhk\") pod \"router-default-5444994796-n4vzm\" (UID: \"58b6f788-e88b-4751-b06d-3fb68a316f91\") " pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.912262 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" Oct 01 16:04:49 crc kubenswrapper[4764]: W1001 16:04:49.915116 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25878fb0_4dc8_47aa_b15d_ad43c06319a3.slice/crio-7b07bea838675606693198ff48a637665f09c137ced56fffb188b15f71d83c2e WatchSource:0}: Error finding container 7b07bea838675606693198ff48a637665f09c137ced56fffb188b15f71d83c2e: Status 404 returned error can't find the container with id 7b07bea838675606693198ff48a637665f09c137ced56fffb188b15f71d83c2e Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.921395 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.929219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/972c24b8-8c37-4214-b0f0-6449046a3eca-kube-api-access-rw2cl\") pod \"kube-storage-version-migrator-operator-b67b599dd-4tcjp\" (UID: \"972c24b8-8c37-4214-b0f0-6449046a3eca\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.941545 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.955261 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqlkv\" (UniqueName: \"kubernetes.io/projected/98d6aa4d-7d9b-4b28-82a8-75be1d101b6c-kube-api-access-sqlkv\") pod \"dns-operator-744455d44c-29nrh\" (UID: \"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c\") " pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.964382 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/74c0b14c-2e43-4b8c-8bba-eb597ffc13ce-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-d9h6p\" (UID: \"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.975513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:49 crc kubenswrapper[4764]: E1001 16:04:49.975870 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.475852398 +0000 UTC m=+153.475499233 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.977254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.992034 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdt5h\" (UniqueName: \"kubernetes.io/projected/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-kube-api-access-qdt5h\") pod \"collect-profiles-29322240-f5k2s\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:49 crc kubenswrapper[4764]: I1001 16:04:49.999475 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.000808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br89k\" (UniqueName: \"kubernetes.io/projected/d333f596-88f5-4641-b577-fd416e45c25d-kube-api-access-br89k\") pod \"dns-default-6kz6b\" (UID: \"d333f596-88f5-4641-b577-fd416e45c25d\") " pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.011387 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j7l4"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.013263 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.020279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7br74\" (UniqueName: \"kubernetes.io/projected/15522425-a48b-41f0-8327-8ce597b09dd2-kube-api-access-7br74\") pod \"machine-config-operator-74547568cd-cf72j\" (UID: \"15522425-a48b-41f0-8327-8ce597b09dd2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.029335 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.035203 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qk"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.044937 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.049430 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/01e2a53e-c853-4151-9c6c-a81917e54bf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-tdrhc\" (UID: \"01e2a53e-c853-4151-9c6c-a81917e54bf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.061980 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.065067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lswq2\" (UniqueName: \"kubernetes.io/projected/eccae5ea-5d95-4d65-97cd-9d8ee4db20bc-kube-api-access-lswq2\") pod \"control-plane-machine-set-operator-78cbb6b69f-lntzx\" (UID: \"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.076808 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.076991 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.576971569 +0000 UTC m=+153.576618404 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.077187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.077477 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.577469761 +0000 UTC m=+153.577116596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.078382 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.087848 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce380ecc_2685_4ceb_85f6_617c8f7c0eaa.slice/crio-b55ef8c0c01475569bba7dec61f9b3318c8cf7b9107adacb2becff9716d5dc7c WatchSource:0}: Error finding container b55ef8c0c01475569bba7dec61f9b3318c8cf7b9107adacb2becff9716d5dc7c: Status 404 returned error can't find the container with id b55ef8c0c01475569bba7dec61f9b3318c8cf7b9107adacb2becff9716d5dc7c Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.100248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.104827 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc737f5_37f4_47a3_8716_af033cbe27fc.slice/crio-06ac98415128dc7aa7722425750012ad3abd4d776e9ce4dbcc2b3770eb1937d8 WatchSource:0}: Error finding container 06ac98415128dc7aa7722425750012ad3abd4d776e9ce4dbcc2b3770eb1937d8: Status 404 returned error can't find the container with id 06ac98415128dc7aa7722425750012ad3abd4d776e9ce4dbcc2b3770eb1937d8 Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.106927 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc932b6b4_a3cf_45d5_9f11_1f9feff6fdb2.slice/crio-1eda25385477d0272ffeba455178d403f4af5b68315209afa188ce8b7ff8b2fb WatchSource:0}: Error finding container 1eda25385477d0272ffeba455178d403f4af5b68315209afa188ce8b7ff8b2fb: Status 404 returned error can't find the container with id 1eda25385477d0272ffeba455178d403f4af5b68315209afa188ce8b7ff8b2fb Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.109017 4764 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdbf81ba_852e_4222_8ed8_efd4dbd27a80.slice/crio-83064d9c18d650457413df2da18f731ebdb2564c487ba75387a953f714017b54 WatchSource:0}: Error finding container 83064d9c18d650457413df2da18f731ebdb2564c487ba75387a953f714017b54: Status 404 returned error can't find the container with id 83064d9c18d650457413df2da18f731ebdb2564c487ba75387a953f714017b54 Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.159681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.169475 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.176411 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.183922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.187909 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.195960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.196015 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc"] Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.196566 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.696545681 +0000 UTC m=+153.696192516 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.203804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.232356 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.298401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.298677 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.798665087 +0000 UTC m=+153.798311922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.306777 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t47fw"] Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.324662 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaedccabf_870c_4bca_9d09_028aa2416702.slice/crio-d1f386a5da4dc69ef3e50d7e450652b7c4085ce627ecf5b7cc9c837a60e1d5c0 WatchSource:0}: Error finding container d1f386a5da4dc69ef3e50d7e450652b7c4085ce627ecf5b7cc9c837a60e1d5c0: Status 404 returned error 
can't find the container with id d1f386a5da4dc69ef3e50d7e450652b7c4085ce627ecf5b7cc9c837a60e1d5c0 Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.338005 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wc7z5"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.339479 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.358676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.399607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.399979 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:50.899961024 +0000 UTC m=+153.899607869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.479863 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" event={"ID":"807b336c-8d01-4644-8c75-1e9da97c0d23","Type":"ContainerStarted","Data":"197466e0c472f14e73f0f43d16867bc5582771fbfc7ac764f8add7d18af28d31"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.484972 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.490160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" event={"ID":"fdbf81ba-852e-4222-8ed8-efd4dbd27a80","Type":"ContainerStarted","Data":"83064d9c18d650457413df2da18f731ebdb2564c487ba75387a953f714017b54"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.499857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" event={"ID":"1dc737f5-37f4-47a3-8716-af033cbe27fc","Type":"ContainerStarted","Data":"06ac98415128dc7aa7722425750012ad3abd4d776e9ce4dbcc2b3770eb1937d8"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.501458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.501874 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.001858833 +0000 UTC m=+154.001505668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.504812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" event={"ID":"bd701ac0-1755-4925-97c5-d0def443d990","Type":"ContainerStarted","Data":"b93fe2003b3e6f9ec8dddde5673601753a4291d593496d36e2244b0e04facc19"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.504856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" event={"ID":"bd701ac0-1755-4925-97c5-d0def443d990","Type":"ContainerStarted","Data":"0bff411471e34c90a87ac9ffe6bca9ea0f80c455fe77df015f3b8d811edb26a6"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.506196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" event={"ID":"7bd2b073-6d2e-4de8-b164-f853c7e01794","Type":"ContainerStarted","Data":"58b4776ec9364e60ed1dfa99206ae4100b40d456b9b3e2af17929a3343166088"} Oct 01 
16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.512707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" event={"ID":"c7afea08-2815-437c-b5ce-26e40f80edda","Type":"ContainerStarted","Data":"f604083dec62b64d6f3da95961185ae1edeca3b1b5c24f21fa8f2ab8a9821d89"} Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.528573 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b4fd74_b139_4155_920d_139eeda695d5.slice/crio-b06dea1277df3ceed0cc6721b7685ffe80e335f45b32d67f39294a0d1134f4cd WatchSource:0}: Error finding container b06dea1277df3ceed0cc6721b7685ffe80e335f45b32d67f39294a0d1134f4cd: Status 404 returned error can't find the container with id b06dea1277df3ceed0cc6721b7685ffe80e335f45b32d67f39294a0d1134f4cd Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.560367 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf32960fb_d6fb_41bf_8e3c_4c26e3dd80af.slice/crio-2c21a10e0155cec2e28d7a2df45442f91a60b61c3483cd5be5535c3a0c105d5f WatchSource:0}: Error finding container 2c21a10e0155cec2e28d7a2df45442f91a60b61c3483cd5be5535c3a0c105d5f: Status 404 returned error can't find the container with id 2c21a10e0155cec2e28d7a2df45442f91a60b61c3483cd5be5535c3a0c105d5f Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.602642 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.602807 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.102773809 +0000 UTC m=+154.102420654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.603275 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.603749 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.103739993 +0000 UTC m=+154.103386828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.641941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp2zw"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.679940 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.709070 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.709242 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.209217803 +0000 UTC m=+154.208864638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.710215 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.210207278 +0000 UTC m=+154.209854113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.709349 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.715132 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" 
event={"ID":"aedccabf-870c-4bca-9d09-028aa2416702","Type":"ContainerStarted","Data":"d1f386a5da4dc69ef3e50d7e450652b7c4085ce627ecf5b7cc9c837a60e1d5c0"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.716561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jcsqb" event={"ID":"25878fb0-4dc8-47aa-b15d-ad43c06319a3","Type":"ContainerStarted","Data":"7b07bea838675606693198ff48a637665f09c137ced56fffb188b15f71d83c2e"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.718586 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kz6b"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.719011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5trjp" event={"ID":"ce380ecc-2685-4ceb-85f6-617c8f7c0eaa","Type":"ContainerStarted","Data":"b55ef8c0c01475569bba7dec61f9b3318c8cf7b9107adacb2becff9716d5dc7c"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.720786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7wm6j"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.720850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" event={"ID":"dd433de1-a494-45e1-9a19-1b619fe7c3bc","Type":"ContainerStarted","Data":"4370cf69f4b7618dc794c17b3b66e386037969b85c38ad24618f3c5f0e956407"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.722506 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" event={"ID":"fb95f0bc-df36-46d1-9c1e-39f3bdb97735","Type":"ContainerStarted","Data":"b61a8e1018f035ed0ea9b2639eb2be670ea8605a13c8e802c2fbe67104474720"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.723440 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-wcpvk"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.725682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" event={"ID":"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2","Type":"ContainerStarted","Data":"1eda25385477d0272ffeba455178d403f4af5b68315209afa188ce8b7ff8b2fb"} Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.732965 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.815665 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.815832 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.315806722 +0000 UTC m=+154.315453557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.816197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.820742 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.320717444 +0000 UTC m=+154.320364279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: W1001 16:04:50.874363 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee3fedb_c6d1_421a_85f5_a46b964a47b7.slice/crio-183a166bbaa75a22c8a7ec90bd7dea4589b503cf15d961a9e26b0179bfffc798 WatchSource:0}: Error finding container 183a166bbaa75a22c8a7ec90bd7dea4589b503cf15d961a9e26b0179bfffc798: Status 404 returned error can't find the container with id 183a166bbaa75a22c8a7ec90bd7dea4589b503cf15d961a9e26b0179bfffc798 Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.892697 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-927gg" Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.916897 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.918067 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:50 crc kubenswrapper[4764]: E1001 16:04:50.918399 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.418383169 +0000 UTC m=+154.418030004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.927259 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk"] Oct 01 16:04:50 crc kubenswrapper[4764]: I1001 16:04:50.987786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.018736 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.020155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.020485 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.520473794 +0000 UTC m=+154.520120629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.112985 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pfzm8" podStartSLOduration=129.112970979 podStartE2EDuration="2m9.112970979s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:51.112842646 +0000 UTC m=+154.112489481" watchObservedRunningTime="2025-10-01 16:04:51.112970979 +0000 UTC m=+154.112617814" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.118329 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2gkmm"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.121520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.121819 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.62180531 +0000 UTC m=+154.621452145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.144303 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.146546 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9"] Oct 01 16:04:51 crc kubenswrapper[4764]: W1001 16:04:51.171874 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04e7251_fb9c_4ba7_b06b_7c058c6b0859.slice/crio-978a54c7a398ff98ddf8a6dcf124aeaac4472b0121b90f8fbfa37bfd0e6d53c9 WatchSource:0}: Error finding container 978a54c7a398ff98ddf8a6dcf124aeaac4472b0121b90f8fbfa37bfd0e6d53c9: Status 404 returned error can't find the container with id 978a54c7a398ff98ddf8a6dcf124aeaac4472b0121b90f8fbfa37bfd0e6d53c9 Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.223816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.224224 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.724212353 +0000 UTC m=+154.723859188 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: W1001 16:04:51.278866 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca157967_0878_4ffe_a609_6c4be43a9ee0.slice/crio-29e56772d7a1808f8ce62f5f9352a400e6cdd9484656002065ccc5e528e2b13f WatchSource:0}: Error finding container 29e56772d7a1808f8ce62f5f9352a400e6cdd9484656002065ccc5e528e2b13f: Status 404 returned error can't find the container with id 29e56772d7a1808f8ce62f5f9352a400e6cdd9484656002065ccc5e528e2b13f Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.292746 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lt4t4" podStartSLOduration=129.292724638 podStartE2EDuration="2m9.292724638s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:51.249288091 +0000 UTC m=+154.248934926" watchObservedRunningTime="2025-10-01 16:04:51.292724638 +0000 UTC 
m=+154.292371463" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.292859 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9np25" podStartSLOduration=129.292852791 podStartE2EDuration="2m9.292852791s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:51.28840233 +0000 UTC m=+154.288049165" watchObservedRunningTime="2025-10-01 16:04:51.292852791 +0000 UTC m=+154.292499626" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.325245 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.325442 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.825419156 +0000 UTC m=+154.825065981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.325506 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.326122 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.826033552 +0000 UTC m=+154.825680387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.405883 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.426678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.426863 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.926844615 +0000 UTC m=+154.926491450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.427001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.427409 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:51.927399309 +0000 UTC m=+154.927046144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.469863 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-29nrh"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.486214 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.511212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.528248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.528714 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.028697875 +0000 UTC m=+155.028344710 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: W1001 16:04:51.555648 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e2a53e_c853_4151_9c6c_a81917e54bf0.slice/crio-fe65228b276d52526d1c9b15c6ec5955cc2cb8069910967d25d17f07b01bb53a WatchSource:0}: Error finding container fe65228b276d52526d1c9b15c6ec5955cc2cb8069910967d25d17f07b01bb53a: Status 404 returned error can't find the container with id fe65228b276d52526d1c9b15c6ec5955cc2cb8069910967d25d17f07b01bb53a Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.598062 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx"] Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.600751 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-927gg" podStartSLOduration=129.600728367 podStartE2EDuration="2m9.600728367s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:51.597114097 +0000 UTC m=+154.596760932" watchObservedRunningTime="2025-10-01 16:04:51.600728367 +0000 UTC m=+154.600375222" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.606619 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j"] Oct 01 16:04:51 
crc kubenswrapper[4764]: I1001 16:04:51.629282 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.629621 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.129609271 +0000 UTC m=+155.129256106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: W1001 16:04:51.698418 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeccae5ea_5d95_4d65_97cd_9d8ee4db20bc.slice/crio-fd09ee8664ce940b930d53153b8ffa7cd4e5057c435d987c62f57bd4dfa42e3a WatchSource:0}: Error finding container fd09ee8664ce940b930d53153b8ffa7cd4e5057c435d987c62f57bd4dfa42e3a: Status 404 returned error can't find the container with id fd09ee8664ce940b930d53153b8ffa7cd4e5057c435d987c62f57bd4dfa42e3a Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.731662 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qp66p" podStartSLOduration=129.731641185 podStartE2EDuration="2m9.731641185s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:51.678877404 +0000 UTC m=+154.678524479" watchObservedRunningTime="2025-10-01 16:04:51.731641185 +0000 UTC m=+154.731288020" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.734205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.735121 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.235101191 +0000 UTC m=+155.234748026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.736409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.736822 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.236806803 +0000 UTC m=+155.236453638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.756350 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp2zw" event={"ID":"4cc85075-9ddf-47e1-9564-615f6f5f26ed","Type":"ContainerStarted","Data":"3f68235a8e32e52856c2d3244576415dfa7ac0c9f55a9dde1cbeb39197b6f2d9"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.756392 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp2zw" event={"ID":"4cc85075-9ddf-47e1-9564-615f6f5f26ed","Type":"ContainerStarted","Data":"96671498d3d933a6b8f5ba17d8735e385d56ef2f47b6ad435a1a75c744b51258"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.768332 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" event={"ID":"c6d4d4d2-b14a-4159-9755-267ab804edb0","Type":"ContainerStarted","Data":"1ba0d7b9ccbd50fc690aa2be697d6f436a231ceca92a2cfdf07099f4e45ede3f"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.782503 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jcsqb" event={"ID":"25878fb0-4dc8-47aa-b15d-ad43c06319a3","Type":"ContainerStarted","Data":"ea848345d283a5559e7b50b829c2363562ca71ff3d955df20f2d8e67a4eaee40"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.789285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" 
event={"ID":"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd","Type":"ContainerStarted","Data":"afc9164c23088e601d16bf8fff5459ce600669fefc27ef32eb0d46d8d472927b"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.796115 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" event={"ID":"c9b4fd74-b139-4155-920d-139eeda695d5","Type":"ContainerStarted","Data":"0f70175f0028af3f09308d19a6680782e401f61b4d37e5416959918d5b24f025"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.796147 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" event={"ID":"c9b4fd74-b139-4155-920d-139eeda695d5","Type":"ContainerStarted","Data":"b06dea1277df3ceed0cc6721b7685ffe80e335f45b32d67f39294a0d1134f4cd"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.799322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n4vzm" event={"ID":"58b6f788-e88b-4751-b06d-3fb68a316f91","Type":"ContainerStarted","Data":"b51770f77c97dad906b3684446a32405ca7a38e24396db8efccd5c763fbb38b7"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.804401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" event={"ID":"01e2a53e-c853-4151-9c6c-a81917e54bf0","Type":"ContainerStarted","Data":"fe65228b276d52526d1c9b15c6ec5955cc2cb8069910967d25d17f07b01bb53a"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.807150 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" podStartSLOduration=129.807136894 podStartE2EDuration="2m9.807136894s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:51.806551109 +0000 UTC m=+154.806197944" 
watchObservedRunningTime="2025-10-01 16:04:51.807136894 +0000 UTC m=+154.806783729" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.827464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" event={"ID":"16b03fe5-5fd8-460b-b4a1-0011c47bf79b","Type":"ContainerStarted","Data":"fb9307244a6d55c83d8944c4e188fa79b3f5e3db526c13ae28161c2f43eb03cc"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.827528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" event={"ID":"16b03fe5-5fd8-460b-b4a1-0011c47bf79b","Type":"ContainerStarted","Data":"985f5812313f3ca89ff7189f849738184e5fb37880a3524a15f64849294f5b1b"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.828403 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.834318 4764 generic.go:334] "Generic (PLEG): container finished" podID="bd701ac0-1755-4925-97c5-d0def443d990" containerID="b93fe2003b3e6f9ec8dddde5673601753a4291d593496d36e2244b0e04facc19" exitCode=0 Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.834669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" event={"ID":"bd701ac0-1755-4925-97c5-d0def443d990","Type":"ContainerDied","Data":"b93fe2003b3e6f9ec8dddde5673601753a4291d593496d36e2244b0e04facc19"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.835147 4764 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jgb79 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 
16:04:51.835275 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" podUID="16b03fe5-5fd8-460b-b4a1-0011c47bf79b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.837764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.840178 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.340133589 +0000 UTC m=+155.339780444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.849197 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.851338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" event={"ID":"dd433de1-a494-45e1-9a19-1b619fe7c3bc","Type":"ContainerStarted","Data":"830cb00e77da2523a735cfbb17727c6d7dfb0a708ad6377f7f8f431ba3d71cd2"} Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.852070 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.352003407 +0000 UTC m=+155.351650242 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.871185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" event={"ID":"964956b3-18e0-4894-b0c0-960a6770384e","Type":"ContainerStarted","Data":"5a4495d4c1a7fa5abd3f7e5ee855d87d0c16be83db78d2f9c7d3576dfd62786d"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.871233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" event={"ID":"964956b3-18e0-4894-b0c0-960a6770384e","Type":"ContainerStarted","Data":"cea47eeefb83b6795caea91fa9debb1460dd165eaaf4aad8523334c72bd2dcf3"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.881940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" event={"ID":"807b336c-8d01-4644-8c75-1e9da97c0d23","Type":"ContainerStarted","Data":"b778a958b711ba573799d121a2719ccf748d5ea78db0b05a926c1155f49b55e1"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.890172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" event={"ID":"c199b404-aae6-4d71-8aab-022be036624d","Type":"ContainerStarted","Data":"5fec5f3083d83c72e59748526730a78f364f067428cb404b5c6bdf184f402d5c"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.890215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" event={"ID":"c199b404-aae6-4d71-8aab-022be036624d","Type":"ContainerStarted","Data":"c711c237992a74715fd4f7710ba1acea90e78dd5e47fad417e193f33f6859a83"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.894587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" event={"ID":"fb9458af-3147-4e40-8816-2d50eeaee101","Type":"ContainerStarted","Data":"4a0cbb52887e0a85db3c0b5677b726afb7eb3e8ca6879b2072ac1e9a205a207b"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.899951 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" event={"ID":"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c","Type":"ContainerStarted","Data":"c9d9549369de81be3362a8fd3055b8c2eed181ac73c273cabdaaef951ae147eb"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.905119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" event={"ID":"c7afea08-2815-437c-b5ce-26e40f80edda","Type":"ContainerStarted","Data":"6450209107d94c96efa9f69e72401f4a9976feb763e37dd3117858432f9c0e8e"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.908006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" event={"ID":"83c8dab0-d64e-45be-b358-0baaaf1eca60","Type":"ContainerStarted","Data":"fc07faa7a1f9dc7d5d5a5a4bd40f835827de0af024c07e7e270901255dbed9c0"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.911094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5trjp" event={"ID":"ce380ecc-2685-4ceb-85f6-617c8f7c0eaa","Type":"ContainerStarted","Data":"796b4032cca6ca85a8afbdde22e6761298cb17976a97d184d65c4001eac3d038"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.911899 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.913405 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-5trjp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.913438 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5trjp" podUID="ce380ecc-2685-4ceb-85f6-617c8f7c0eaa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.913706 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.913760 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.919449 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" event={"ID":"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce","Type":"ContainerStarted","Data":"173bc44b07a8b659e9ff56a2c0f1a97d4b3ccafd043b2f3abf284a0a9b002fa0"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.924883 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-6kz6b" event={"ID":"d333f596-88f5-4641-b577-fd416e45c25d","Type":"ContainerStarted","Data":"8355e243cd2056fe82ed5d70a4c939538754bbec4a8d9fdf58353a7c5cc0decc"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.931643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" event={"ID":"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2","Type":"ContainerStarted","Data":"7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.932700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.934548 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cd5qk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.934584 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.936950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" event={"ID":"fb95f0bc-df36-46d1-9c1e-39f3bdb97735","Type":"ContainerStarted","Data":"2bdcec489a28bc6b54aca14219a890c2dbfa9240c91adef2b8f07ae0f2708c75"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.940063 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" event={"ID":"ca157967-0878-4ffe-a609-6c4be43a9ee0","Type":"ContainerStarted","Data":"29e56772d7a1808f8ce62f5f9352a400e6cdd9484656002065ccc5e528e2b13f"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.947475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" event={"ID":"0ee3fedb-c6d1-421a-85f5-a46b964a47b7","Type":"ContainerStarted","Data":"80a68769d3109f5d39ed0639e14ea526a1ac429239bbd2624ac38aff4a7b818c"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.947512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" event={"ID":"0ee3fedb-c6d1-421a-85f5-a46b964a47b7","Type":"ContainerStarted","Data":"183a166bbaa75a22c8a7ec90bd7dea4589b503cf15d961a9e26b0179bfffc798"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.951896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.952027 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.45200476 +0000 UTC m=+155.451651595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.952169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.952221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" event={"ID":"b81be073-688e-4025-8973-fd9147b4a8fa","Type":"ContainerStarted","Data":"271da663ef59ed16032724342aca30f6cd91de5bff3443231b66a6f99846a8a4"} Oct 01 16:04:51 crc kubenswrapper[4764]: E1001 16:04:51.952721 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.452614604 +0000 UTC m=+155.452261439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.954476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" event={"ID":"15522425-a48b-41f0-8327-8ce597b09dd2","Type":"ContainerStarted","Data":"35ba50e37dbd266cd5c377c0a080cbf35edf2ef8aa213d566c4bc0db249c6357"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.962832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" event={"ID":"aedccabf-870c-4bca-9d09-028aa2416702","Type":"ContainerStarted","Data":"d7a75f11f4623f4255211cca3376549e5cdb9ab48ec5a134aa7014cc1e28103d"} Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.963882 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.966638 4764 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-n5qtc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 01 16:04:51 crc kubenswrapper[4764]: I1001 16:04:51.966696 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" 
podUID="aedccabf-870c-4bca-9d09-028aa2416702" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.001768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" event={"ID":"972c24b8-8c37-4214-b0f0-6449046a3eca","Type":"ContainerStarted","Data":"2c29cd3a65e36a3a7f851030343576601d5cb89448367e409fc450fa6041192e"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.055428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" event={"ID":"5101da2b-8945-460a-951e-a61632f28c98","Type":"ContainerStarted","Data":"2713a74e65972ab7fe50c5719f6ae027dacb6e245b0fb18a4afc7dc31e1be102"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.055676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.055799 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.555776237 +0000 UTC m=+155.555423072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.057545 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.057971 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.557955292 +0000 UTC m=+155.557602227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.074365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" event={"ID":"c04e7251-fb9c-4ba7-b06b-7c058c6b0859","Type":"ContainerStarted","Data":"978a54c7a398ff98ddf8a6dcf124aeaac4472b0121b90f8fbfa37bfd0e6d53c9"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.087620 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" podStartSLOduration=130.087589733 podStartE2EDuration="2m10.087589733s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.084757843 +0000 UTC m=+155.084404678" watchObservedRunningTime="2025-10-01 16:04:52.087589733 +0000 UTC m=+155.087236568" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.092037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" event={"ID":"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af","Type":"ContainerStarted","Data":"2680ede6a935c101a140b075d683726e05a4f2e7607f8d475c286703fc96593a"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.092133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" 
event={"ID":"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af","Type":"ContainerStarted","Data":"2c21a10e0155cec2e28d7a2df45442f91a60b61c3483cd5be5535c3a0c105d5f"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.123612 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" podStartSLOduration=130.123589905 podStartE2EDuration="2m10.123589905s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.1206304 +0000 UTC m=+155.120277245" watchObservedRunningTime="2025-10-01 16:04:52.123589905 +0000 UTC m=+155.123236740" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.127894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" event={"ID":"1dc737f5-37f4-47a3-8716-af033cbe27fc","Type":"ContainerStarted","Data":"6bf3b8f25815634bcc2b6e429f92701176838961f57da4fcf167a89026396f0f"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.128742 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.133517 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6j7l4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.133575 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.136696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" event={"ID":"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc","Type":"ContainerStarted","Data":"fd09ee8664ce940b930d53153b8ffa7cd4e5057c435d987c62f57bd4dfa42e3a"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.141495 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bd2b073-6d2e-4de8-b164-f853c7e01794" containerID="6f454f2367c0f71cdc20086ca3f6c6ccb21ea5df992399ca3af415bc2f1aa594" exitCode=0 Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.141570 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" event={"ID":"7bd2b073-6d2e-4de8-b164-f853c7e01794","Type":"ContainerDied","Data":"6f454f2367c0f71cdc20086ca3f6c6ccb21ea5df992399ca3af415bc2f1aa594"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.160936 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.161029 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.661009881 +0000 UTC m=+155.660656716 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.161242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" event={"ID":"fdbf81ba-852e-4222-8ed8-efd4dbd27a80","Type":"ContainerStarted","Data":"2c630c5d5aab1087a641a37f097e7fd727b2a53cb94d686c0cb67d27538a2de4"} Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.161674 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.165613 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.665595626 +0000 UTC m=+155.665242461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.266236 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.266399 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.766368518 +0000 UTC m=+155.766015353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.266682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.267000 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.766987603 +0000 UTC m=+155.766634438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.282026 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" podStartSLOduration=130.28200948 podStartE2EDuration="2m10.28200948s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.28005925 +0000 UTC m=+155.279706105" watchObservedRunningTime="2025-10-01 16:04:52.28200948 +0000 UTC m=+155.281656315" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.327336 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" podStartSLOduration=130.327317353 podStartE2EDuration="2m10.327317353s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.325433337 +0000 UTC m=+155.325080192" watchObservedRunningTime="2025-10-01 16:04:52.327317353 +0000 UTC m=+155.326964188" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.367734 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.369092 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.869066619 +0000 UTC m=+155.868713534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.406776 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5trjp" podStartSLOduration=130.406757992 podStartE2EDuration="2m10.406757992s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.404827823 +0000 UTC m=+155.404474658" watchObservedRunningTime="2025-10-01 16:04:52.406757992 +0000 UTC m=+155.406404827" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.408854 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vdnsd" podStartSLOduration=130.408838804 podStartE2EDuration="2m10.408838804s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.36274042 +0000 UTC m=+155.362387255" 
watchObservedRunningTime="2025-10-01 16:04:52.408838804 +0000 UTC m=+155.408485639" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.444927 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sv6hf" podStartSLOduration=130.444907657 podStartE2EDuration="2m10.444907657s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.442698141 +0000 UTC m=+155.442344986" watchObservedRunningTime="2025-10-01 16:04:52.444907657 +0000 UTC m=+155.444554502" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.470948 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.471307 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:52.971294267 +0000 UTC m=+155.970941102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.483160 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jcsqb" podStartSLOduration=5.483140313 podStartE2EDuration="5.483140313s" podCreationTimestamp="2025-10-01 16:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.479615996 +0000 UTC m=+155.479262831" watchObservedRunningTime="2025-10-01 16:04:52.483140313 +0000 UTC m=+155.482787148" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.537105 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dp2zw" podStartSLOduration=5.537086734 podStartE2EDuration="5.537086734s" podCreationTimestamp="2025-10-01 16:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.534582631 +0000 UTC m=+155.534229466" watchObservedRunningTime="2025-10-01 16:04:52.537086734 +0000 UTC m=+155.536733569" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.574505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.574854 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.074839449 +0000 UTC m=+156.074486284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.612065 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sklbl" podStartSLOduration=130.612026449 podStartE2EDuration="2m10.612026449s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.574741487 +0000 UTC m=+155.574388322" watchObservedRunningTime="2025-10-01 16:04:52.612026449 +0000 UTC m=+155.611673284" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.612923 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t47fw" podStartSLOduration=130.612916442 podStartE2EDuration="2m10.612916442s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.607179638 +0000 UTC 
m=+155.606826503" watchObservedRunningTime="2025-10-01 16:04:52.612916442 +0000 UTC m=+155.612563287" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.649872 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" podStartSLOduration=130.649854176 podStartE2EDuration="2m10.649854176s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.648169454 +0000 UTC m=+155.647816309" watchObservedRunningTime="2025-10-01 16:04:52.649854176 +0000 UTC m=+155.649501011" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.683845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.684241 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.184230307 +0000 UTC m=+156.183877142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.689782 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wc7z5" podStartSLOduration=130.689764905 podStartE2EDuration="2m10.689764905s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.684006051 +0000 UTC m=+155.683652886" watchObservedRunningTime="2025-10-01 16:04:52.689764905 +0000 UTC m=+155.689411740" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.780007 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" podStartSLOduration=130.779992793 podStartE2EDuration="2m10.779992793s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.728442024 +0000 UTC m=+155.728088869" watchObservedRunningTime="2025-10-01 16:04:52.779992793 +0000 UTC m=+155.779639628" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.785060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.785328 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.285313087 +0000 UTC m=+156.284959922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.810584 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" podStartSLOduration=130.810570239 podStartE2EDuration="2m10.810570239s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:52.80859369 +0000 UTC m=+155.808240525" watchObservedRunningTime="2025-10-01 16:04:52.810570239 +0000 UTC m=+155.810217074" Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.886941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 
16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.887257 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.387243178 +0000 UTC m=+156.386890013 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.987756 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.987946 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.487927958 +0000 UTC m=+156.487574793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:52 crc kubenswrapper[4764]: I1001 16:04:52.988558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:52 crc kubenswrapper[4764]: E1001 16:04:52.989020 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.489008565 +0000 UTC m=+156.488655401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.090121 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.090327 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.59029471 +0000 UTC m=+156.589941545 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.090408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.090801 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.590788972 +0000 UTC m=+156.590435877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.166448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" event={"ID":"bd701ac0-1755-4925-97c5-d0def443d990","Type":"ContainerStarted","Data":"31ae4d8b59d50ef5132995276dce5017cea67fbeb25b589f32bd117571e17d60"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.166601 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.168869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" event={"ID":"c04e7251-fb9c-4ba7-b06b-7c058c6b0859","Type":"ContainerStarted","Data":"0c353aa29ba267280dc3ab746b937616a3e483370f1ddeab75d0573108d086c1"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.168917 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" event={"ID":"c04e7251-fb9c-4ba7-b06b-7c058c6b0859","Type":"ContainerStarted","Data":"007bedb39873c74034670fec421ca515c9020a0de4623c1e4d2f3d128d725a3d"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.170524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" 
event={"ID":"fb95f0bc-df36-46d1-9c1e-39f3bdb97735","Type":"ContainerStarted","Data":"5f3a4824542f0f7a04b9e94afb7f846d878cb5760f7083f0add36d2f02da1ba5"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.172753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" event={"ID":"5101da2b-8945-460a-951e-a61632f28c98","Type":"ContainerStarted","Data":"e57053b2dbb0be28abfee93ba975732fc6b8a633365e81d296e98fc303b1d50d"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.174234 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" event={"ID":"01e2a53e-c853-4151-9c6c-a81917e54bf0","Type":"ContainerStarted","Data":"7eec3f5f5a57c1ab69d76b213d1f8ddfa168f56e4db4ad2f976fa9763fb11470"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.174284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" event={"ID":"01e2a53e-c853-4151-9c6c-a81917e54bf0","Type":"ContainerStarted","Data":"be7b2b6e583114816c8c06be8ea46e7a29b712e12c510dcfa1d3e5860b3f0182"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.176222 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" event={"ID":"ca157967-0878-4ffe-a609-6c4be43a9ee0","Type":"ContainerStarted","Data":"2e67c64d2fa4b42813795ca1b3508c15af24e88c06ea58c7e75fc1300a447bae"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.176267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" event={"ID":"ca157967-0878-4ffe-a609-6c4be43a9ee0","Type":"ContainerStarted","Data":"a3b8e7dac778c4e8cb77ce9907271432945c75a3d42863016061b264c5f55f4b"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.177293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" event={"ID":"83c8dab0-d64e-45be-b358-0baaaf1eca60","Type":"ContainerStarted","Data":"2e6d7064d717146f142c05c5a08ea42a8c47ef6cebcf31c1623f9f6822195e2a"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.177501 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.178594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" event={"ID":"972c24b8-8c37-4214-b0f0-6449046a3eca","Type":"ContainerStarted","Data":"6e4a14cbd3466d6e37b8590bcf00859c16a3de05f1bfd0c36c680007616db502"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.179253 4764 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-bzmh9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.179314 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" podUID="83c8dab0-d64e-45be-b358-0baaaf1eca60" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.185637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" event={"ID":"c6d4d4d2-b14a-4159-9755-267ab804edb0","Type":"ContainerStarted","Data":"9bc903bbd94bf2b0c454189b821ec83825162a286fad3cc3ebe1a5e3cbd0ffdb"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.191747 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.191887 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.691868663 +0000 UTC m=+156.691515498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.191995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.192339 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.692329934 +0000 UTC m=+156.691976769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.193472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" event={"ID":"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd","Type":"ContainerStarted","Data":"d2e990a079af0a920474c8d37cb4e608bd50a1c2dcd8fc41a19927a62aa1322c"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.193515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" event={"ID":"18323a84-6150-4a2d-bc8d-b29b4a0a7ddd","Type":"ContainerStarted","Data":"41834ba3f760d9a363c8d83e14d40e9ff51346980edfa4847ee6addeadf3686d"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.197160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" event={"ID":"15522425-a48b-41f0-8327-8ce597b09dd2","Type":"ContainerStarted","Data":"efdd70227aafe39d5d9d50cce8ced89c8a6db3bc5bb18255fb111375a581d984"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.197217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" event={"ID":"15522425-a48b-41f0-8327-8ce597b09dd2","Type":"ContainerStarted","Data":"4dbfd19910bf7e07e16d2de53b369bce7b7c8fa4d067a97c10f7ab933d7a7d1a"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.202174 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" event={"ID":"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c","Type":"ContainerStarted","Data":"33cf895cc461926831b231a4fa6bf3279c5128a9f2f4d2095ec8bedb50cc002f"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.202226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" event={"ID":"98d6aa4d-7d9b-4b28-82a8-75be1d101b6c","Type":"ContainerStarted","Data":"f4f0c4b684da61d8d6cd19a21a47ebb8a725d13dcac2eac0ad1a79137ab01715"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.204474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kz6b" event={"ID":"d333f596-88f5-4641-b577-fd416e45c25d","Type":"ContainerStarted","Data":"197b8d2c220417bab5855d0a14b0a46eedb7a8a3007deab633971cd4bd369e40"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.204515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kz6b" event={"ID":"d333f596-88f5-4641-b577-fd416e45c25d","Type":"ContainerStarted","Data":"5a3d09e61478518b5833779dba27e785dfe0968eade7682e5f045a4ba4579481"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.204601 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6kz6b" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.208547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" event={"ID":"eccae5ea-5d95-4d65-97cd-9d8ee4db20bc","Type":"ContainerStarted","Data":"4b9749b9a8d8a963db143b9f89693ec4ce00e82e61146abaa31a01d901583623"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.210228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" 
event={"ID":"b81be073-688e-4025-8973-fd9147b4a8fa","Type":"ContainerStarted","Data":"65f923efb9f7a7d8cd4315b481805ced7ca5255a5fd188bf6dea2decdc073a6e"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.212190 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" event={"ID":"7bd2b073-6d2e-4de8-b164-f853c7e01794","Type":"ContainerStarted","Data":"90ec61d4c5ded6dbca200e981e866ab585f3cbc25f708b1f24c6d75d2f0b4619"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.213964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" event={"ID":"74c0b14c-2e43-4b8c-8bba-eb597ffc13ce","Type":"ContainerStarted","Data":"3dea0dcf6572c9be2295523108fd3094314c1785290c507a19b27fcb53333c36"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.216201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" event={"ID":"f32960fb-d6fb-41bf-8e3c-4c26e3dd80af","Type":"ContainerStarted","Data":"1b1ad9a44b2b4c0cdbfe20353418ece58c1ed037aeb8e54c239fd3000d77e794"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.216428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.219653 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n4vzm" event={"ID":"58b6f788-e88b-4751-b06d-3fb68a316f91","Type":"ContainerStarted","Data":"1b85de2899062615bd6a0e5dfcfb0b9e900f9fc2644c0660e5882dae348155a9"} Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220258 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-5trjp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220325 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5trjp" podUID="ce380ecc-2685-4ceb-85f6-617c8f7c0eaa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220383 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220499 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6j7l4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220543 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220860 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cd5qk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.220887 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" 
podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.235841 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jgb79" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.266398 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4tcjp" podStartSLOduration=131.266375398 podStartE2EDuration="2m11.266375398s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.263715441 +0000 UTC m=+156.263362296" watchObservedRunningTime="2025-10-01 16:04:53.266375398 +0000 UTC m=+156.266022233" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.266561 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" podStartSLOduration=131.266556522 podStartE2EDuration="2m11.266556522s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.228348236 +0000 UTC m=+156.227995071" watchObservedRunningTime="2025-10-01 16:04:53.266556522 +0000 UTC m=+156.266203357" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.289356 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-mrdz7" podStartSLOduration=131.289335052 podStartE2EDuration="2m11.289335052s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.286828289 +0000 UTC m=+156.286475124" watchObservedRunningTime="2025-10-01 16:04:53.289335052 +0000 UTC m=+156.288981887" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.293147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.293353 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.793326442 +0000 UTC m=+156.792973277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.293926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.296654 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.796634025 +0000 UTC m=+156.796280880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.323815 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-n4vzm" podStartSLOduration=131.323793815 podStartE2EDuration="2m11.323793815s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.32282438 +0000 UTC m=+156.322471225" watchObservedRunningTime="2025-10-01 16:04:53.323793815 +0000 UTC m=+156.323440650" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.329851 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.357474 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2gkmm" podStartSLOduration=131.357451237 podStartE2EDuration="2m11.357451237s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.352011551 +0000 UTC m=+156.351658386" watchObservedRunningTime="2025-10-01 16:04:53.357451237 +0000 UTC m=+156.357098072" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.402638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.403402 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mt8wt" podStartSLOduration=131.403384877 podStartE2EDuration="2m11.403384877s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.401957751 +0000 UTC m=+156.401604586" watchObservedRunningTime="2025-10-01 16:04:53.403384877 +0000 UTC m=+156.403031712" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.404232 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:53.904215937 +0000 UTC m=+156.903862762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.433441 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.433515 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.504818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.505457 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.005442882 +0000 UTC m=+157.005089717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.523250 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-d9h6p" podStartSLOduration=131.523231186 podStartE2EDuration="2m11.523231186s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.518085798 +0000 UTC m=+156.517732633" watchObservedRunningTime="2025-10-01 16:04:53.523231186 +0000 UTC m=+156.522878021" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.523402 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" podStartSLOduration=131.52339818 podStartE2EDuration="2m11.52339818s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.461314367 +0000 UTC m=+156.460961222" watchObservedRunningTime="2025-10-01 16:04:53.52339818 +0000 UTC m=+156.523045015" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.555714 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7wm6j" podStartSLOduration=131.555693229 podStartE2EDuration="2m11.555693229s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.554764996 +0000 UTC m=+156.554411841" watchObservedRunningTime="2025-10-01 16:04:53.555693229 +0000 UTC m=+156.555340064" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.587922 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cf72j" podStartSLOduration=131.587901645 podStartE2EDuration="2m11.587901645s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.587096215 +0000 UTC m=+156.586743070" watchObservedRunningTime="2025-10-01 16:04:53.587901645 +0000 UTC m=+156.587548500" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.608177 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.608560 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.108545372 +0000 UTC m=+157.108192207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.636304 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-29nrh" podStartSLOduration=131.636288106 podStartE2EDuration="2m11.636288106s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.634502422 +0000 UTC m=+156.634149267" watchObservedRunningTime="2025-10-01 16:04:53.636288106 +0000 UTC m=+156.635934941" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.691567 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qb64g" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.709882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.710238 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 16:04:54.210222327 +0000 UTC m=+157.209869162 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.771192 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lntzx" podStartSLOduration=131.771172962 podStartE2EDuration="2m11.771172962s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.67440198 +0000 UTC m=+156.674048815" watchObservedRunningTime="2025-10-01 16:04:53.771172962 +0000 UTC m=+156.770819797" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.800345 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ph942" podStartSLOduration=132.800326372 podStartE2EDuration="2m12.800326372s" podCreationTimestamp="2025-10-01 16:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.774343922 +0000 UTC m=+156.773990757" watchObservedRunningTime="2025-10-01 16:04:53.800326372 +0000 UTC m=+156.799973207" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.810707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.810914 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.310888107 +0000 UTC m=+157.310534942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.810966 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.811260 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.311250205 +0000 UTC m=+157.310897040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.827444 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-tdrhc" podStartSLOduration=131.82742592 podStartE2EDuration="2m11.82742592s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.806376283 +0000 UTC m=+156.806023118" watchObservedRunningTime="2025-10-01 16:04:53.82742592 +0000 UTC m=+156.827072755" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.877324 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" podStartSLOduration=131.877310199 podStartE2EDuration="2m11.877310199s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.876339814 +0000 UTC m=+156.875986649" watchObservedRunningTime="2025-10-01 16:04:53.877310199 +0000 UTC m=+156.876957034" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.878923 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r9bvz" podStartSLOduration=131.878917109 podStartE2EDuration="2m11.878917109s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.833281897 +0000 UTC m=+156.832928732" watchObservedRunningTime="2025-10-01 16:04:53.878917109 +0000 UTC m=+156.878563944" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.912647 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:53 crc kubenswrapper[4764]: E1001 16:04:53.912963 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.412950211 +0000 UTC m=+157.412597046 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.950067 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" podStartSLOduration=131.950019429 podStartE2EDuration="2m11.950019429s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.906287384 +0000 UTC m=+156.905934219" watchObservedRunningTime="2025-10-01 16:04:53.950019429 +0000 UTC m=+156.949666274" Oct 01 16:04:53 crc kubenswrapper[4764]: I1001 16:04:53.972862 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6kz6b" podStartSLOduration=6.97284598 podStartE2EDuration="6.97284598s" podCreationTimestamp="2025-10-01 16:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.951309591 +0000 UTC m=+156.950956426" watchObservedRunningTime="2025-10-01 16:04:53.97284598 +0000 UTC m=+156.972492815" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.014101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: 
\"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.014425 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.51441482 +0000 UTC m=+157.514061655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.014613 4764 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6bg4z container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]log ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]etcd ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/max-in-flight-filter ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 01 16:04:54 crc kubenswrapper[4764]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 01 16:04:54 crc kubenswrapper[4764]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/project.openshift.io-projectcache ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-startinformers ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 01 16:04:54 crc kubenswrapper[4764]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 16:04:54 crc kubenswrapper[4764]: livez check failed Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.014659 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" podUID="c7afea08-2815-437c-b5ce-26e40f80edda" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.033194 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tdvpk" podStartSLOduration=132.03317789 podStartE2EDuration="2m12.03317789s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:53.975430775 +0000 UTC m=+156.975077610" watchObservedRunningTime="2025-10-01 16:04:54.03317789 +0000 UTC m=+157.032824725" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.115329 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:54 crc 
kubenswrapper[4764]: E1001 16:04:54.115506 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.61548202 +0000 UTC m=+157.615128855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.115599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.115902 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.61589472 +0000 UTC m=+157.615541555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.187344 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.191289 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:04:54 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:04:54 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:04:54 crc kubenswrapper[4764]: healthz check failed Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.191336 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.216305 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.216731 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.716716344 +0000 UTC m=+157.716363169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.265367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" event={"ID":"fb9458af-3147-4e40-8816-2d50eeaee101","Type":"ContainerStarted","Data":"3f79ebaf06f096482724e77603137046a5fe294df3a4b8636b5a70e10a373a1f"} Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.267022 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-5trjp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.267072 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5trjp" podUID="ce380ecc-2685-4ceb-85f6-617c8f7c0eaa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.280991 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 
16:04:54.323410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.323840 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.823824895 +0000 UTC m=+157.823471730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.372141 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bzmh9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.424710 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.924693149 +0000 UTC m=+157.924339984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.424738 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.425262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.440345 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:54.940329641 +0000 UTC m=+157.939976466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.461460 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.461519 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.529456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.529825 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.02981067 +0000 UTC m=+158.029457505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.630984 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.631414 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.131401614 +0000 UTC m=+158.131048449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.732739 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.733031 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.233017326 +0000 UTC m=+158.232664161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.833996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.834292 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.334280742 +0000 UTC m=+158.333927577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.934635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.934779 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.434759466 +0000 UTC m=+158.434406301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.934885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:54 crc kubenswrapper[4764]: E1001 16:04:54.935149 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.435140886 +0000 UTC m=+158.434787721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.975566 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qzcqj"] Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.976456 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:54 crc kubenswrapper[4764]: I1001 16:04:54.978071 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.005478 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzcqj"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.038481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.038766 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.538751789 +0000 UTC m=+158.538398624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.139601 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.139664 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-catalog-content\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.139710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-utilities\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.139735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt464\" (UniqueName: 
\"kubernetes.io/projected/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-kube-api-access-xt464\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.140004 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.639993363 +0000 UTC m=+158.639640198 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.166066 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q5pm5"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.166933 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.170080 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.191712 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5pm5"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.192080 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:04:55 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:04:55 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:04:55 crc kubenswrapper[4764]: healthz check failed Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.192109 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.240452 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.240676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-utilities\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") 
" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.240711 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt464\" (UniqueName: \"kubernetes.io/projected/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-kube-api-access-xt464\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.240771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-catalog-content\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.241231 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-catalog-content\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.241740 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.741280428 +0000 UTC m=+158.740927253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.241812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-utilities\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.270342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt464\" (UniqueName: \"kubernetes.io/projected/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-kube-api-access-xt464\") pod \"community-operators-qzcqj\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.284346 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ee3fedb-c6d1-421a-85f5-a46b964a47b7" containerID="80a68769d3109f5d39ed0639e14ea526a1ac429239bbd2624ac38aff4a7b818c" exitCode=0 Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.284424 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" event={"ID":"0ee3fedb-c6d1-421a-85f5-a46b964a47b7","Type":"ContainerDied","Data":"80a68769d3109f5d39ed0639e14ea526a1ac429239bbd2624ac38aff4a7b818c"} Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.301169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" 
event={"ID":"fb9458af-3147-4e40-8816-2d50eeaee101","Type":"ContainerStarted","Data":"72073b9f1b42b5c8262cfc377c1f8602b694f2d1ac05fe8e7c0d5730270931c8"} Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.341854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.342019 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s98gc\" (UniqueName: \"kubernetes.io/projected/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-kube-api-access-s98gc\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.342098 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-utilities\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.342130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-catalog-content\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.342323 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.842311026 +0000 UTC m=+158.841957861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.342961 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.357157 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-62xnd"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.357967 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.380858 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-62xnd"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.442598 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.442852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s98gc\" (UniqueName: \"kubernetes.io/projected/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-kube-api-access-s98gc\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.442927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-utilities\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.443019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-catalog-content\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.445660 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-catalog-content\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.445743 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:55.945728325 +0000 UTC m=+158.945375160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.447555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-utilities\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.478586 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s98gc\" (UniqueName: \"kubernetes.io/projected/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-kube-api-access-s98gc\") pod \"certified-operators-q5pm5\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.496692 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.520239 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.547513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-utilities\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.547559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-catalog-content\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.547607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.547640 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9md5\" (UniqueName: \"kubernetes.io/projected/90a2cc50-3e21-4686-b89d-9263fd00cec7-kube-api-access-f9md5\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc 
kubenswrapper[4764]: E1001 16:04:55.547982 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.047969134 +0000 UTC m=+159.047615969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.548579 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nmtgg" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.582210 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xmw68"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.583177 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.585339 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmw68"] Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.698463 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.198432761 +0000 UTC m=+159.198079596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698640 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkkrn\" (UniqueName: \"kubernetes.io/projected/f954a6cc-bafb-431d-809f-3f20374f189d-kube-api-access-tkkrn\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698664 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-utilities\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698742 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9md5\" (UniqueName: 
\"kubernetes.io/projected/90a2cc50-3e21-4686-b89d-9263fd00cec7-kube-api-access-f9md5\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-utilities\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-catalog-content\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.698886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-catalog-content\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.698926 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.198914912 +0000 UTC m=+159.198561747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.699272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-catalog-content\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.699483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-utilities\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.774354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9md5\" (UniqueName: \"kubernetes.io/projected/90a2cc50-3e21-4686-b89d-9263fd00cec7-kube-api-access-f9md5\") pod \"community-operators-62xnd\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.799494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.799715 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-catalog-content\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.799757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkkrn\" (UniqueName: \"kubernetes.io/projected/f954a6cc-bafb-431d-809f-3f20374f189d-kube-api-access-tkkrn\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.799772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-utilities\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.800249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-utilities\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.800342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-catalog-content\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " 
pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.800423 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.300406823 +0000 UTC m=+159.300053658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.841783 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkkrn\" (UniqueName: \"kubernetes.io/projected/f954a6cc-bafb-431d-809f-3f20374f189d-kube-api-access-tkkrn\") pod \"certified-operators-xmw68\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.905662 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.906429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:55 crc kubenswrapper[4764]: E1001 16:04:55.906699 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.406687223 +0000 UTC m=+159.406334058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:55 crc kubenswrapper[4764]: I1001 16:04:55.974073 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qzcqj"] Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.006926 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.007222 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.507203298 +0000 UTC m=+159.506850133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.042502 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.118441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.118708 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.618697259 +0000 UTC m=+159.618344094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.190574 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:04:56 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:04:56 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:04:56 crc kubenswrapper[4764]: healthz check failed Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.190636 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.220148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.220358 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 16:04:56.720332634 +0000 UTC m=+159.719979469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.308687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerStarted","Data":"5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f"} Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.308741 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerStarted","Data":"6626a0ceceec15cf1c3f12e4019c28446891c9a26f921f305a267b0b60da79c8"} Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.321586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.321860 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 16:04:56.821847404 +0000 UTC m=+159.821494239 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.346163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" event={"ID":"fb9458af-3147-4e40-8816-2d50eeaee101","Type":"ContainerStarted","Data":"925adb436b0f91b6f04c97f98ea93c92d5cf6666a7805473d44e29c66fd9a48d"} Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.346227 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" event={"ID":"fb9458af-3147-4e40-8816-2d50eeaee101","Type":"ContainerStarted","Data":"906fc3d2888885308c0f56feabd924953f30b0b1cba5031a7ee3dc1b340dcb73"} Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.356223 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-frl64" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.382447 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wcpvk" podStartSLOduration=9.38242883 podStartE2EDuration="9.38242883s" podCreationTimestamp="2025-10-01 16:04:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:04:56.380885031 +0000 UTC m=+159.380531887" watchObservedRunningTime="2025-10-01 16:04:56.38242883 +0000 UTC m=+159.382075665" Oct 01 16:04:56 
crc kubenswrapper[4764]: I1001 16:04:56.422222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.422530 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:56.922503943 +0000 UTC m=+159.922150778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.493074 4764 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.523800 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.524115 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:57.024102746 +0000 UTC m=+160.023749581 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.560166 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xmw68"] Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.625157 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.625746 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:57.12572746 +0000 UTC m=+160.125374295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.637449 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-62xnd"] Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.730574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.730857 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:57.230845111 +0000 UTC m=+160.230491946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.787274 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.831953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-secret-volume\") pod \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.832016 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-config-volume\") pod \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.832224 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.832288 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdt5h\" (UniqueName: 
\"kubernetes.io/projected/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-kube-api-access-qdt5h\") pod \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\" (UID: \"0ee3fedb-c6d1-421a-85f5-a46b964a47b7\") " Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.832399 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 16:04:57.332372262 +0000 UTC m=+160.332019107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.832472 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.832737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ee3fedb-c6d1-421a-85f5-a46b964a47b7" (UID: "0ee3fedb-c6d1-421a-85f5-a46b964a47b7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:04:56 crc kubenswrapper[4764]: E1001 16:04:56.832796 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 16:04:57.332785622 +0000 UTC m=+160.332432517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m4gx9" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.837191 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-kube-api-access-qdt5h" (OuterVolumeSpecName: "kube-api-access-qdt5h") pod "0ee3fedb-c6d1-421a-85f5-a46b964a47b7" (UID: "0ee3fedb-c6d1-421a-85f5-a46b964a47b7"). InnerVolumeSpecName "kube-api-access-qdt5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.837219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ee3fedb-c6d1-421a-85f5-a46b964a47b7" (UID: "0ee3fedb-c6d1-421a-85f5-a46b964a47b7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.905007 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5pm5"] Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.910002 4764 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T16:04:56.493263204Z","Handler":null,"Name":""} Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.916823 4764 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.916870 4764 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.933383 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.933701 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdt5h\" (UniqueName: \"kubernetes.io/projected/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-kube-api-access-qdt5h\") on node \"crc\" DevicePath \"\"" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.933726 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:04:56 crc kubenswrapper[4764]: 
I1001 16:04:56.933742 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ee3fedb-c6d1-421a-85f5-a46b964a47b7-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:04:56 crc kubenswrapper[4764]: I1001 16:04:56.936565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.034811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.054219 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.054258 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.114218 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m4gx9\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.151389 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzspt"] Oct 01 16:04:57 crc kubenswrapper[4764]: E1001 16:04:57.151816 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee3fedb-c6d1-421a-85f5-a46b964a47b7" containerName="collect-profiles" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.151893 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee3fedb-c6d1-421a-85f5-a46b964a47b7" containerName="collect-profiles" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.152108 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee3fedb-c6d1-421a-85f5-a46b964a47b7" containerName="collect-profiles" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.153039 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.154740 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.164117 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzspt"] Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.190572 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:04:57 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:04:57 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:04:57 crc kubenswrapper[4764]: healthz check failed Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.190622 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.237878 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-utilities\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.237914 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-catalog-content\") pod \"redhat-marketplace-tzspt\" (UID: 
\"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.237964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8whlc\" (UniqueName: \"kubernetes.io/projected/c2c1132f-fe3a-4e68-bc7d-0362951208ac-kube-api-access-8whlc\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.339292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-utilities\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.339347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-catalog-content\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.339433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8whlc\" (UniqueName: \"kubernetes.io/projected/c2c1132f-fe3a-4e68-bc7d-0362951208ac-kube-api-access-8whlc\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.339758 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-utilities\") pod \"redhat-marketplace-tzspt\" (UID: 
\"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.340150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-catalog-content\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.344078 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.351242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" event={"ID":"0ee3fedb-c6d1-421a-85f5-a46b964a47b7","Type":"ContainerDied","Data":"183a166bbaa75a22c8a7ec90bd7dea4589b503cf15d961a9e26b0179bfffc798"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.351291 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="183a166bbaa75a22c8a7ec90bd7dea4589b503cf15d961a9e26b0179bfffc798" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.351302 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.353561 4764 generic.go:334] "Generic (PLEG): container finished" podID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerID="a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1" exitCode=0 Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.353656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5pm5" event={"ID":"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea","Type":"ContainerDied","Data":"a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.353688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5pm5" event={"ID":"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea","Type":"ContainerStarted","Data":"3f9d70b11151ad87a2fa1b1d7d19cd5ab5d79281dfac6e1b38b738301de1bea0"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.355371 4764 generic.go:334] "Generic (PLEG): container finished" podID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerID="ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab" exitCode=0 Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.355421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerDied","Data":"ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.355480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerStarted","Data":"a1088a456270b123f35467bbbf588766b862643031460ce1b28d1617f2088a2b"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.356958 4764 generic.go:334] "Generic 
(PLEG): container finished" podID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerID="5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f" exitCode=0 Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.357102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerDied","Data":"5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.358486 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.360449 4764 generic.go:334] "Generic (PLEG): container finished" podID="f954a6cc-bafb-431d-809f-3f20374f189d" containerID="05c5efdb10e62d3608751edc6e299ace89c2f68bc1c2db9ff4d277d1b00938f2" exitCode=0 Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.360598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmw68" event={"ID":"f954a6cc-bafb-431d-809f-3f20374f189d","Type":"ContainerDied","Data":"05c5efdb10e62d3608751edc6e299ace89c2f68bc1c2db9ff4d277d1b00938f2"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.360633 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmw68" event={"ID":"f954a6cc-bafb-431d-809f-3f20374f189d","Type":"ContainerStarted","Data":"552159ec421242a3f1673a20fb9d146c0ef1f9b35b01287f89b6ec13b4dadcaa"} Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.363564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8whlc\" (UniqueName: \"kubernetes.io/projected/c2c1132f-fe3a-4e68-bc7d-0362951208ac-kube-api-access-8whlc\") pod \"redhat-marketplace-tzspt\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 
16:04:57.539566 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.551086 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ngn7h"] Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.552020 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.559849 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m4gx9"] Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.564312 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngn7h"] Oct 01 16:04:57 crc kubenswrapper[4764]: W1001 16:04:57.569088 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e84d4e_1676_4857_ad1d_76ee4bcaf7c8.slice/crio-33a6b547337bbeac58b61e3a55156bc34839c387d6343a85b356652acd87b412 WatchSource:0}: Error finding container 33a6b547337bbeac58b61e3a55156bc34839c387d6343a85b356652acd87b412: Status 404 returned error can't find the container with id 33a6b547337bbeac58b61e3a55156bc34839c387d6343a85b356652acd87b412 Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.643023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-utilities\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.643140 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-catalog-content\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.643270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvfkg\" (UniqueName: \"kubernetes.io/projected/11e59851-5279-4f01-be7f-b9b17e85892a-kube-api-access-gvfkg\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.716302 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzspt"] Oct 01 16:04:57 crc kubenswrapper[4764]: W1001 16:04:57.722592 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2c1132f_fe3a_4e68_bc7d_0362951208ac.slice/crio-f9a63b3d7da6e6caff0e91bd58a0a30f8b5db49b74e54025034b97711b6e21ea WatchSource:0}: Error finding container f9a63b3d7da6e6caff0e91bd58a0a30f8b5db49b74e54025034b97711b6e21ea: Status 404 returned error can't find the container with id f9a63b3d7da6e6caff0e91bd58a0a30f8b5db49b74e54025034b97711b6e21ea Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.744711 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-catalog-content\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.745063 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvfkg\" (UniqueName: 
\"kubernetes.io/projected/11e59851-5279-4f01-be7f-b9b17e85892a-kube-api-access-gvfkg\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.745116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-utilities\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.745234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-catalog-content\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.745493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-utilities\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.756415 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.767463 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvfkg\" (UniqueName: \"kubernetes.io/projected/11e59851-5279-4f01-be7f-b9b17e85892a-kube-api-access-gvfkg\") pod \"redhat-marketplace-ngn7h\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " 
pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:57 crc kubenswrapper[4764]: I1001 16:04:57.895432 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.015488 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.100468 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngn7h"] Oct 01 16:04:58 crc kubenswrapper[4764]: W1001 16:04:58.107982 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e59851_5279_4f01_be7f_b9b17e85892a.slice/crio-4782bb7477a38a574cc651a3dad3024a5ca21ce2b16f68de1ec1c9ae18842dfa WatchSource:0}: Error finding container 4782bb7477a38a574cc651a3dad3024a5ca21ce2b16f68de1ec1c9ae18842dfa: Status 404 returned error can't find the container with id 4782bb7477a38a574cc651a3dad3024a5ca21ce2b16f68de1ec1c9ae18842dfa Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.163828 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dr8vb"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.167151 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.169015 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.179450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dr8vb"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.193570 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:04:58 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:04:58 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:04:58 crc kubenswrapper[4764]: healthz check failed Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.193618 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.251240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-catalog-content\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.251338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-utilities\") pod \"redhat-operators-dr8vb\" (UID: 
\"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.251359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7czk\" (UniqueName: \"kubernetes.io/projected/e169a786-1e32-414c-a44a-fa7afa25d04a-kube-api-access-j7czk\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.275612 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.276282 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.280697 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.281037 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.281338 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.340163 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.340328 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.345091 4764 patch_prober.go:28] interesting pod/console-f9d7485db-pfzm8 container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.345302 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pfzm8" podUID="35ad23c6-6d86-4e4f-b642-336f47fe999c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.352242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-utilities\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.352274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7czk\" (UniqueName: \"kubernetes.io/projected/e169a786-1e32-414c-a44a-fa7afa25d04a-kube-api-access-j7czk\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.352314 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-catalog-content\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.353363 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-utilities\") pod \"redhat-operators-dr8vb\" (UID: 
\"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.353405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-catalog-content\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.370297 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7czk\" (UniqueName: \"kubernetes.io/projected/e169a786-1e32-414c-a44a-fa7afa25d04a-kube-api-access-j7czk\") pod \"redhat-operators-dr8vb\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.370553 4764 generic.go:334] "Generic (PLEG): container finished" podID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerID="8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5" exitCode=0 Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.370877 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzspt" event={"ID":"c2c1132f-fe3a-4e68-bc7d-0362951208ac","Type":"ContainerDied","Data":"8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5"} Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.370925 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzspt" event={"ID":"c2c1132f-fe3a-4e68-bc7d-0362951208ac","Type":"ContainerStarted","Data":"f9a63b3d7da6e6caff0e91bd58a0a30f8b5db49b74e54025034b97711b6e21ea"} Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.374360 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" 
event={"ID":"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8","Type":"ContainerStarted","Data":"b5b390ab2a2764d7c618507c77aeb49f0680efb7d1f9f79542ee0a163198ca72"} Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.374417 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" event={"ID":"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8","Type":"ContainerStarted","Data":"33a6b547337bbeac58b61e3a55156bc34839c387d6343a85b356652acd87b412"} Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.375228 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.378670 4764 generic.go:334] "Generic (PLEG): container finished" podID="11e59851-5279-4f01-be7f-b9b17e85892a" containerID="9a1aa9fc0a36e3b76c08f5ac15f5b40f19dd631c043dcfe834721e96178b8fbf" exitCode=0 Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.378737 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerDied","Data":"9a1aa9fc0a36e3b76c08f5ac15f5b40f19dd631c043dcfe834721e96178b8fbf"} Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.378787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerStarted","Data":"4782bb7477a38a574cc651a3dad3024a5ca21ce2b16f68de1ec1c9ae18842dfa"} Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.405708 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" podStartSLOduration=136.405692151 podStartE2EDuration="2m16.405692151s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-01 16:04:58.405060406 +0000 UTC m=+161.404707241" watchObservedRunningTime="2025-10-01 16:04:58.405692151 +0000 UTC m=+161.405338986" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.433527 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.439007 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6bg4z" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.462640 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde2a409-91df-47a0-b77e-390d1effcc03-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.462718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde2a409-91df-47a0-b77e-390d1effcc03-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.507369 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.564694 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde2a409-91df-47a0-b77e-390d1effcc03-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.564872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde2a409-91df-47a0-b77e-390d1effcc03-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.568880 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde2a409-91df-47a0-b77e-390d1effcc03-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.569660 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sbdlr"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.579847 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.616521 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sbdlr"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.626446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde2a409-91df-47a0-b77e-390d1effcc03-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.667727 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-utilities\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.667795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-catalog-content\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.667899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2rs\" (UniqueName: \"kubernetes.io/projected/c9e48889-43b2-4025-88f2-3ef8040c45e2-kube-api-access-hm2rs\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.770349 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-catalog-content\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.770459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2rs\" (UniqueName: \"kubernetes.io/projected/c9e48889-43b2-4025-88f2-3ef8040c45e2-kube-api-access-hm2rs\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.770519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-utilities\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.771027 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-utilities\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.771315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-catalog-content\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.814959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2rs\" (UniqueName: 
\"kubernetes.io/projected/c9e48889-43b2-4025-88f2-3ef8040c45e2-kube-api-access-hm2rs\") pod \"redhat-operators-sbdlr\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.836402 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dr8vb"] Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.898570 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:04:58 crc kubenswrapper[4764]: I1001 16:04:58.922597 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.191231 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:04:59 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:04:59 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:04:59 crc kubenswrapper[4764]: healthz check failed Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.191626 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.396599 4764 generic.go:334] "Generic (PLEG): container finished" podID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerID="49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399" exitCode=0 Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.397116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerDied","Data":"49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399"} Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.397347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerStarted","Data":"0d810dabe9002c52028f54f763bdff8d77681f7749dc47275d156fec230057ca"} Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.466693 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.486375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sbdlr"] Oct 01 16:04:59 crc kubenswrapper[4764]: W1001 16:04:59.527402 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcde2a409_91df_47a0_b77e_390d1effcc03.slice/crio-b925d10c7b4d7e1ec3f84029f04095acaf249934d363875d589e8847bb69f962 WatchSource:0}: Error finding container b925d10c7b4d7e1ec3f84029f04095acaf249934d363875d589e8847bb69f962: Status 404 returned error can't find the container with id b925d10c7b4d7e1ec3f84029f04095acaf249934d363875d589e8847bb69f962 Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.581786 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-5trjp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.581814 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-5trjp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: 
connection refused" start-of-body= Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.581833 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5trjp" podUID="ce380ecc-2685-4ceb-85f6-617c8f7c0eaa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.581867 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5trjp" podUID="ce380ecc-2685-4ceb-85f6-617c8f7c0eaa" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.611174 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.961155 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.963085 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.966551 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.966588 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 16:04:59 crc kubenswrapper[4764]: I1001 16:04:59.968117 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.116206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c743cf-db8f-468b-9c3a-0442af42b512-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.116340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c743cf-db8f-468b-9c3a-0442af42b512-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.187364 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.195490 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:05:00 crc kubenswrapper[4764]: [-]has-synced failed: 
reason withheld Oct 01 16:05:00 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:05:00 crc kubenswrapper[4764]: healthz check failed Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.195537 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.217364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c743cf-db8f-468b-9c3a-0442af42b512-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.217590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c743cf-db8f-468b-9c3a-0442af42b512-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.217737 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c743cf-db8f-468b-9c3a-0442af42b512-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.241135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c743cf-db8f-468b-9c3a-0442af42b512-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 
16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.282782 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.423998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cde2a409-91df-47a0-b77e-390d1effcc03","Type":"ContainerStarted","Data":"4b1675be58c1be79021efd4e0ceab57409126ad8c2b917ce80174cb044195ad1"} Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.424075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cde2a409-91df-47a0-b77e-390d1effcc03","Type":"ContainerStarted","Data":"b925d10c7b4d7e1ec3f84029f04095acaf249934d363875d589e8847bb69f962"} Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.462144 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerID="2eb240689897c215b036eecc54d34c59db8217765c73009d0a037191b9d7c48f" exitCode=0 Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.463346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbdlr" event={"ID":"c9e48889-43b2-4025-88f2-3ef8040c45e2","Type":"ContainerDied","Data":"2eb240689897c215b036eecc54d34c59db8217765c73009d0a037191b9d7c48f"} Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.463367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbdlr" event={"ID":"c9e48889-43b2-4025-88f2-3ef8040c45e2","Type":"ContainerStarted","Data":"363c3875aeec958806cb7c84b167375223acb5d2ba8d3d1529c9c05b5f393406"} Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.482199 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.482179674 
podStartE2EDuration="2.482179674s" podCreationTimestamp="2025-10-01 16:04:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:05:00.455401354 +0000 UTC m=+163.455048209" watchObservedRunningTime="2025-10-01 16:05:00.482179674 +0000 UTC m=+163.481826509" Oct 01 16:05:00 crc kubenswrapper[4764]: I1001 16:05:00.887280 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 16:05:01 crc kubenswrapper[4764]: I1001 16:05:01.191412 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:05:01 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:05:01 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:05:01 crc kubenswrapper[4764]: healthz check failed Oct 01 16:05:01 crc kubenswrapper[4764]: I1001 16:05:01.191690 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:05:01 crc kubenswrapper[4764]: I1001 16:05:01.480452 4764 generic.go:334] "Generic (PLEG): container finished" podID="cde2a409-91df-47a0-b77e-390d1effcc03" containerID="4b1675be58c1be79021efd4e0ceab57409126ad8c2b917ce80174cb044195ad1" exitCode=0 Oct 01 16:05:01 crc kubenswrapper[4764]: I1001 16:05:01.480516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cde2a409-91df-47a0-b77e-390d1effcc03","Type":"ContainerDied","Data":"4b1675be58c1be79021efd4e0ceab57409126ad8c2b917ce80174cb044195ad1"} Oct 01 16:05:01 crc kubenswrapper[4764]: I1001 
16:05:01.489974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78c743cf-db8f-468b-9c3a-0442af42b512","Type":"ContainerStarted","Data":"f36f6a794deeccd812a29bfaf2a038bac23c87fcdaf264976b1ebe191a0d20aa"} Oct 01 16:05:02 crc kubenswrapper[4764]: I1001 16:05:02.103537 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6kz6b" Oct 01 16:05:02 crc kubenswrapper[4764]: I1001 16:05:02.193732 4764 patch_prober.go:28] interesting pod/router-default-5444994796-n4vzm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 16:05:02 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Oct 01 16:05:02 crc kubenswrapper[4764]: [+]process-running ok Oct 01 16:05:02 crc kubenswrapper[4764]: healthz check failed Oct 01 16:05:02 crc kubenswrapper[4764]: I1001 16:05:02.194066 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n4vzm" podUID="58b6f788-e88b-4751-b06d-3fb68a316f91" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 16:05:02 crc kubenswrapper[4764]: I1001 16:05:02.511648 4764 generic.go:334] "Generic (PLEG): container finished" podID="78c743cf-db8f-468b-9c3a-0442af42b512" containerID="d3ab65d22dac9a6df36ef1f58d4e4eb0073ae8728e883cbed18a2a97096e9e8e" exitCode=0 Oct 01 16:05:02 crc kubenswrapper[4764]: I1001 16:05:02.511796 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78c743cf-db8f-468b-9c3a-0442af42b512","Type":"ContainerDied","Data":"d3ab65d22dac9a6df36ef1f58d4e4eb0073ae8728e883cbed18a2a97096e9e8e"} Oct 01 16:05:02 crc kubenswrapper[4764]: I1001 16:05:02.885484 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.002649 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde2a409-91df-47a0-b77e-390d1effcc03-kubelet-dir\") pod \"cde2a409-91df-47a0-b77e-390d1effcc03\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.002730 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde2a409-91df-47a0-b77e-390d1effcc03-kube-api-access\") pod \"cde2a409-91df-47a0-b77e-390d1effcc03\" (UID: \"cde2a409-91df-47a0-b77e-390d1effcc03\") " Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.002792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cde2a409-91df-47a0-b77e-390d1effcc03-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cde2a409-91df-47a0-b77e-390d1effcc03" (UID: "cde2a409-91df-47a0-b77e-390d1effcc03"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.002971 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cde2a409-91df-47a0-b77e-390d1effcc03-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.019263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde2a409-91df-47a0-b77e-390d1effcc03-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cde2a409-91df-47a0-b77e-390d1effcc03" (UID: "cde2a409-91df-47a0-b77e-390d1effcc03"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.104300 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cde2a409-91df-47a0-b77e-390d1effcc03-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.190543 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.193234 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-n4vzm" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.521819 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.522436 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"cde2a409-91df-47a0-b77e-390d1effcc03","Type":"ContainerDied","Data":"b925d10c7b4d7e1ec3f84029f04095acaf249934d363875d589e8847bb69f962"} Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.522462 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b925d10c7b4d7e1ec3f84029f04095acaf249934d363875d589e8847bb69f962" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.824232 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.917925 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c743cf-db8f-468b-9c3a-0442af42b512-kube-api-access\") pod \"78c743cf-db8f-468b-9c3a-0442af42b512\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.918021 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c743cf-db8f-468b-9c3a-0442af42b512-kubelet-dir\") pod \"78c743cf-db8f-468b-9c3a-0442af42b512\" (UID: \"78c743cf-db8f-468b-9c3a-0442af42b512\") " Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.918279 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78c743cf-db8f-468b-9c3a-0442af42b512-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78c743cf-db8f-468b-9c3a-0442af42b512" (UID: "78c743cf-db8f-468b-9c3a-0442af42b512"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:05:03 crc kubenswrapper[4764]: I1001 16:05:03.922414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c743cf-db8f-468b-9c3a-0442af42b512-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78c743cf-db8f-468b-9c3a-0442af42b512" (UID: "78c743cf-db8f-468b-9c3a-0442af42b512"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.019949 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78c743cf-db8f-468b-9c3a-0442af42b512-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.019980 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78c743cf-db8f-468b-9c3a-0442af42b512-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.551889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78c743cf-db8f-468b-9c3a-0442af42b512","Type":"ContainerDied","Data":"f36f6a794deeccd812a29bfaf2a038bac23c87fcdaf264976b1ebe191a0d20aa"} Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.551940 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f36f6a794deeccd812a29bfaf2a038bac23c87fcdaf264976b1ebe191a0d20aa" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.551943 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.633485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.639566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41a0358d-ae10-4282-9423-8f3599adbc2a-metrics-certs\") pod \"network-metrics-daemon-btbfp\" (UID: \"41a0358d-ae10-4282-9423-8f3599adbc2a\") " pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:05:04 crc kubenswrapper[4764]: I1001 16:05:04.675716 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-btbfp" Oct 01 16:05:08 crc kubenswrapper[4764]: I1001 16:05:08.343787 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:05:08 crc kubenswrapper[4764]: I1001 16:05:08.347884 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:05:09 crc kubenswrapper[4764]: I1001 16:05:09.599191 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5trjp" Oct 01 16:05:17 crc kubenswrapper[4764]: I1001 16:05:17.352373 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:05:21 crc kubenswrapper[4764]: I1001 16:05:21.914237 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:05:21 crc kubenswrapper[4764]: I1001 16:05:21.914670 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:05:25 crc kubenswrapper[4764]: I1001 16:05:25.957561 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 16:05:29 crc kubenswrapper[4764]: E1001 16:05:29.144904 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 16:05:29 crc kubenswrapper[4764]: E1001 16:05:29.145520 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xt464,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qzcqj_openshift-marketplace(6729eb41-7fe9-4e39-bffa-0c3180e2e6ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:29 crc kubenswrapper[4764]: E1001 16:05:29.147001 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qzcqj" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" Oct 01 16:05:29 crc 
kubenswrapper[4764]: E1001 16:05:29.686916 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 16:05:29 crc kubenswrapper[4764]: E1001 16:05:29.687454 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8whlc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-tzspt_openshift-marketplace(c2c1132f-fe3a-4e68-bc7d-0362951208ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:29 crc kubenswrapper[4764]: E1001 16:05:29.688695 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tzspt" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" Oct 01 16:05:29 crc kubenswrapper[4764]: I1001 16:05:29.796497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwnjx" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.332027 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tzspt" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.332083 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qzcqj" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.507262 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 
16:05:31.507424 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkkrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xmw68_openshift-marketplace(f954a6cc-bafb-431d-809f-3f20374f189d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.509493 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xmw68" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.646137 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.646313 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s98gc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,Secco
mpProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-q5pm5_openshift-marketplace(a5e4a715-7c4f-42c0-a4a0-1de63c1482ea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:31 crc kubenswrapper[4764]: E1001 16:05:31.647730 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-q5pm5" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" Oct 01 16:05:34 crc kubenswrapper[4764]: E1001 16:05:34.187884 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-q5pm5" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" Oct 01 16:05:34 crc kubenswrapper[4764]: E1001 16:05:34.187944 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xmw68" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" Oct 01 16:05:34 crc kubenswrapper[4764]: E1001 16:05:34.261946 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" 
Oct 01 16:05:34 crc kubenswrapper[4764]: E1001 16:05:34.262241 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvfkg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ngn7h_openshift-marketplace(11e59851-5279-4f01-be7f-b9b17e85892a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:34 crc kubenswrapper[4764]: E1001 16:05:34.263439 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ngn7h" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.039557 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ngn7h" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.118139 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.118588 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hm2rs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-sbdlr_openshift-marketplace(c9e48889-43b2-4025-88f2-3ef8040c45e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.120413 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-sbdlr" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" Oct 01 16:05:37 crc 
kubenswrapper[4764]: E1001 16:05:37.170730 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.170877 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j7czk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-dr8vb_openshift-marketplace(e169a786-1e32-414c-a44a-fa7afa25d04a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.172246 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dr8vb" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" Oct 01 16:05:37 crc kubenswrapper[4764]: I1001 16:05:37.434306 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-btbfp"] Oct 01 16:05:37 crc kubenswrapper[4764]: I1001 16:05:37.730463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerStarted","Data":"e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af"} Oct 01 16:05:37 crc kubenswrapper[4764]: I1001 16:05:37.736940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-btbfp" event={"ID":"41a0358d-ae10-4282-9423-8f3599adbc2a","Type":"ContainerStarted","Data":"ceb8d7c0b69b092714a08ea9b85b29d069a3d896e218cdda8e47fa293f4100db"} Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.739723 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dr8vb" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" Oct 01 16:05:37 crc kubenswrapper[4764]: E1001 16:05:37.739773 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-sbdlr" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" Oct 01 16:05:38 crc kubenswrapper[4764]: I1001 16:05:38.746188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-btbfp" event={"ID":"41a0358d-ae10-4282-9423-8f3599adbc2a","Type":"ContainerStarted","Data":"eee7521bbc987fce400acda1ec6544309b84bc7e536a3a0018ec11d3d9c02c0d"} Oct 01 16:05:38 crc kubenswrapper[4764]: I1001 16:05:38.746277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-btbfp" event={"ID":"41a0358d-ae10-4282-9423-8f3599adbc2a","Type":"ContainerStarted","Data":"d32b359543b225faa481bb3652d79094d193e09ae5e176ee239c85782615c4e2"} Oct 01 16:05:38 crc kubenswrapper[4764]: I1001 16:05:38.748592 4764 generic.go:334] "Generic (PLEG): container finished" podID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerID="e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af" exitCode=0 Oct 01 16:05:38 crc kubenswrapper[4764]: I1001 16:05:38.748648 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerDied","Data":"e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af"} Oct 01 16:05:38 crc kubenswrapper[4764]: I1001 16:05:38.770122 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-btbfp" podStartSLOduration=176.770086095 podStartE2EDuration="2m56.770086095s" podCreationTimestamp="2025-10-01 16:02:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:05:38.766427993 +0000 UTC m=+201.766074838" watchObservedRunningTime="2025-10-01 
16:05:38.770086095 +0000 UTC m=+201.769732930" Oct 01 16:05:39 crc kubenswrapper[4764]: I1001 16:05:39.756209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerStarted","Data":"c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe"} Oct 01 16:05:39 crc kubenswrapper[4764]: I1001 16:05:39.776336 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-62xnd" podStartSLOduration=2.9310442070000002 podStartE2EDuration="44.776318723s" podCreationTimestamp="2025-10-01 16:04:55 +0000 UTC" firstStartedPulling="2025-10-01 16:04:57.358257614 +0000 UTC m=+160.357904459" lastFinishedPulling="2025-10-01 16:05:39.20353213 +0000 UTC m=+202.203178975" observedRunningTime="2025-10-01 16:05:39.775853471 +0000 UTC m=+202.775500306" watchObservedRunningTime="2025-10-01 16:05:39.776318723 +0000 UTC m=+202.775965558" Oct 01 16:05:46 crc kubenswrapper[4764]: I1001 16:05:46.044478 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:05:46 crc kubenswrapper[4764]: I1001 16:05:46.045289 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:05:46 crc kubenswrapper[4764]: I1001 16:05:46.756173 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:05:46 crc kubenswrapper[4764]: I1001 16:05:46.839040 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:05:47 crc kubenswrapper[4764]: I1001 16:05:47.970229 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-62xnd"] Oct 01 16:05:48 crc kubenswrapper[4764]: I1001 
16:05:48.803176 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-62xnd" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="registry-server" containerID="cri-o://c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe" gracePeriod=2 Oct 01 16:05:49 crc kubenswrapper[4764]: E1001 16:05:49.448281 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a2cc50_3e21_4686_b89d_9263fd00cec7.slice/crio-c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe.scope\": RecentStats: unable to find data in memory cache]" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.780426 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.812876 4764 generic.go:334] "Generic (PLEG): container finished" podID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerID="c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe" exitCode=0 Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.813112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerDied","Data":"c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe"} Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.813229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62xnd" event={"ID":"90a2cc50-3e21-4686-b89d-9263fd00cec7","Type":"ContainerDied","Data":"a1088a456270b123f35467bbbf588766b862643031460ce1b28d1617f2088a2b"} Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.813187 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-62xnd" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.813315 4764 scope.go:117] "RemoveContainer" containerID="c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.828260 4764 scope.go:117] "RemoveContainer" containerID="e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.853112 4764 scope.go:117] "RemoveContainer" containerID="ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.874423 4764 scope.go:117] "RemoveContainer" containerID="c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe" Oct 01 16:05:49 crc kubenswrapper[4764]: E1001 16:05:49.875173 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe\": container with ID starting with c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe not found: ID does not exist" containerID="c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.875331 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe"} err="failed to get container status \"c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe\": rpc error: code = NotFound desc = could not find container \"c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe\": container with ID starting with c54f7718aa82a1885c001bfe09a4deba7608a2209df11564da3d6320d86c4bbe not found: ID does not exist" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.875480 4764 scope.go:117] "RemoveContainer" 
containerID="e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af" Oct 01 16:05:49 crc kubenswrapper[4764]: E1001 16:05:49.875918 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af\": container with ID starting with e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af not found: ID does not exist" containerID="e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.875966 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af"} err="failed to get container status \"e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af\": rpc error: code = NotFound desc = could not find container \"e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af\": container with ID starting with e06b9aec3b4ef484166aeafac0e612f7079045c139860fc26499ebe7176db6af not found: ID does not exist" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.876037 4764 scope.go:117] "RemoveContainer" containerID="ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab" Oct 01 16:05:49 crc kubenswrapper[4764]: E1001 16:05:49.876324 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab\": container with ID starting with ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab not found: ID does not exist" containerID="ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.876427 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab"} err="failed to get container status \"ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab\": rpc error: code = NotFound desc = could not find container \"ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab\": container with ID starting with ec95a9ad146d40e71ddf44d473a4e9e4365501b2d788c88a7277ee093cc99eab not found: ID does not exist" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.952690 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-catalog-content\") pod \"90a2cc50-3e21-4686-b89d-9263fd00cec7\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.952860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9md5\" (UniqueName: \"kubernetes.io/projected/90a2cc50-3e21-4686-b89d-9263fd00cec7-kube-api-access-f9md5\") pod \"90a2cc50-3e21-4686-b89d-9263fd00cec7\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.952943 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-utilities\") pod \"90a2cc50-3e21-4686-b89d-9263fd00cec7\" (UID: \"90a2cc50-3e21-4686-b89d-9263fd00cec7\") " Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.953896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-utilities" (OuterVolumeSpecName: "utilities") pod "90a2cc50-3e21-4686-b89d-9263fd00cec7" (UID: "90a2cc50-3e21-4686-b89d-9263fd00cec7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:05:49 crc kubenswrapper[4764]: I1001 16:05:49.961506 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90a2cc50-3e21-4686-b89d-9263fd00cec7-kube-api-access-f9md5" (OuterVolumeSpecName: "kube-api-access-f9md5") pod "90a2cc50-3e21-4686-b89d-9263fd00cec7" (UID: "90a2cc50-3e21-4686-b89d-9263fd00cec7"). InnerVolumeSpecName "kube-api-access-f9md5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.006374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90a2cc50-3e21-4686-b89d-9263fd00cec7" (UID: "90a2cc50-3e21-4686-b89d-9263fd00cec7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.054578 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.054614 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9md5\" (UniqueName: \"kubernetes.io/projected/90a2cc50-3e21-4686-b89d-9263fd00cec7-kube-api-access-f9md5\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.054633 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90a2cc50-3e21-4686-b89d-9263fd00cec7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.237580 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-62xnd"] Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 
16:05:50.241719 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-62xnd"] Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.818963 4764 generic.go:334] "Generic (PLEG): container finished" podID="f954a6cc-bafb-431d-809f-3f20374f189d" containerID="0e800de13b2f3fe4fdc88ea8cfde0bfd856d78b8943ab0b76c8310236ad42e2f" exitCode=0 Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.819068 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmw68" event={"ID":"f954a6cc-bafb-431d-809f-3f20374f189d","Type":"ContainerDied","Data":"0e800de13b2f3fe4fdc88ea8cfde0bfd856d78b8943ab0b76c8310236ad42e2f"} Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.821117 4764 generic.go:334] "Generic (PLEG): container finished" podID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerID="d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1" exitCode=0 Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.821199 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzspt" event={"ID":"c2c1132f-fe3a-4e68-bc7d-0362951208ac","Type":"ContainerDied","Data":"d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1"} Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.827765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerStarted","Data":"6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856"} Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.830197 4764 generic.go:334] "Generic (PLEG): container finished" podID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerID="9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb" exitCode=0 Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.830266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-q5pm5" event={"ID":"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea","Type":"ContainerDied","Data":"9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb"} Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.832903 4764 generic.go:334] "Generic (PLEG): container finished" podID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerID="52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a" exitCode=0 Oct 01 16:05:50 crc kubenswrapper[4764]: I1001 16:05:50.832933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerDied","Data":"52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a"} Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.730631 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" path="/var/lib/kubelet/pods/90a2cc50-3e21-4686-b89d-9263fd00cec7/volumes" Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.838398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerStarted","Data":"3b0b27c31409088f8474db183c9730c7f5d9e5c8fa80eb5f17adf52c6ab3d3c6"} Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.840461 4764 generic.go:334] "Generic (PLEG): container finished" podID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerID="6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856" exitCode=0 Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.840501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerDied","Data":"6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856"} Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.913800 4764 
patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.913877 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.913931 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.914600 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:05:51 crc kubenswrapper[4764]: I1001 16:05:51.914678 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65" gracePeriod=600 Oct 01 16:05:52 crc kubenswrapper[4764]: I1001 16:05:52.865890 4764 generic.go:334] "Generic (PLEG): container finished" podID="11e59851-5279-4f01-be7f-b9b17e85892a" containerID="3b0b27c31409088f8474db183c9730c7f5d9e5c8fa80eb5f17adf52c6ab3d3c6" exitCode=0 Oct 01 16:05:52 crc 
kubenswrapper[4764]: I1001 16:05:52.866352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerDied","Data":"3b0b27c31409088f8474db183c9730c7f5d9e5c8fa80eb5f17adf52c6ab3d3c6"} Oct 01 16:05:52 crc kubenswrapper[4764]: I1001 16:05:52.871777 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65" exitCode=0 Oct 01 16:05:52 crc kubenswrapper[4764]: I1001 16:05:52.872114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65"} Oct 01 16:05:56 crc kubenswrapper[4764]: I1001 16:05:56.896012 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5pm5" event={"ID":"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea","Type":"ContainerStarted","Data":"030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9"} Oct 01 16:05:56 crc kubenswrapper[4764]: I1001 16:05:56.897811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317"} Oct 01 16:05:56 crc kubenswrapper[4764]: I1001 16:05:56.916722 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q5pm5" podStartSLOduration=3.50971724 podStartE2EDuration="1m1.916702275s" podCreationTimestamp="2025-10-01 16:04:55 +0000 UTC" firstStartedPulling="2025-10-01 16:04:57.358291415 +0000 UTC m=+160.357938270" lastFinishedPulling="2025-10-01 16:05:55.76527646 +0000 
UTC m=+218.764923305" observedRunningTime="2025-10-01 16:05:56.912742055 +0000 UTC m=+219.912388900" watchObservedRunningTime="2025-10-01 16:05:56.916702275 +0000 UTC m=+219.916349100" Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.922824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmw68" event={"ID":"f954a6cc-bafb-431d-809f-3f20374f189d","Type":"ContainerStarted","Data":"f1bd10a89d3e4fcf55f0aff3fe93df0beda8b1e6dbb2ebadc61b14a8df111a3a"} Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.926210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzspt" event={"ID":"c2c1132f-fe3a-4e68-bc7d-0362951208ac","Type":"ContainerStarted","Data":"13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d"} Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.928285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerStarted","Data":"bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5"} Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.929741 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerID="ec000b95c20570cf98397261bec460128ecbac2166ae4e295907711c8f5ad6dd" exitCode=0 Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.929810 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbdlr" event={"ID":"c9e48889-43b2-4025-88f2-3ef8040c45e2","Type":"ContainerDied","Data":"ec000b95c20570cf98397261bec460128ecbac2166ae4e295907711c8f5ad6dd"} Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.933414 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" 
event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerStarted","Data":"1d3b03bb8c96939b626deb2b7e0b6f4326dc773934acad42a755e9ef78ff9673"} Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.936849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerStarted","Data":"bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d"} Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.957442 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xmw68" podStartSLOduration=3.495415987 podStartE2EDuration="1m3.957426127s" podCreationTimestamp="2025-10-01 16:04:55 +0000 UTC" firstStartedPulling="2025-10-01 16:04:57.363011024 +0000 UTC m=+160.362657899" lastFinishedPulling="2025-10-01 16:05:57.825021154 +0000 UTC m=+220.824668039" observedRunningTime="2025-10-01 16:05:58.956506124 +0000 UTC m=+221.956152959" watchObservedRunningTime="2025-10-01 16:05:58.957426127 +0000 UTC m=+221.957072962" Oct 01 16:05:58 crc kubenswrapper[4764]: I1001 16:05:58.978938 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ngn7h" podStartSLOduration=2.397834209 podStartE2EDuration="1m1.978920121s" podCreationTimestamp="2025-10-01 16:04:57 +0000 UTC" firstStartedPulling="2025-10-01 16:04:58.381443334 +0000 UTC m=+161.381090179" lastFinishedPulling="2025-10-01 16:05:57.962529256 +0000 UTC m=+220.962176091" observedRunningTime="2025-10-01 16:05:58.973995836 +0000 UTC m=+221.973642671" watchObservedRunningTime="2025-10-01 16:05:58.978920121 +0000 UTC m=+221.978566956" Oct 01 16:05:59 crc kubenswrapper[4764]: I1001 16:05:59.068304 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qzcqj" podStartSLOduration=4.616186876 podStartE2EDuration="1m5.068266463s" 
podCreationTimestamp="2025-10-01 16:04:54 +0000 UTC" firstStartedPulling="2025-10-01 16:04:57.358266745 +0000 UTC m=+160.357913570" lastFinishedPulling="2025-10-01 16:05:57.810346322 +0000 UTC m=+220.809993157" observedRunningTime="2025-10-01 16:05:59.064813306 +0000 UTC m=+222.064460151" watchObservedRunningTime="2025-10-01 16:05:59.068266463 +0000 UTC m=+222.067913298" Oct 01 16:05:59 crc kubenswrapper[4764]: I1001 16:05:59.098092 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dr8vb" podStartSLOduration=2.6573215340000003 podStartE2EDuration="1m1.098072418s" podCreationTimestamp="2025-10-01 16:04:58 +0000 UTC" firstStartedPulling="2025-10-01 16:04:59.399583477 +0000 UTC m=+162.399230302" lastFinishedPulling="2025-10-01 16:05:57.840334351 +0000 UTC m=+220.839981186" observedRunningTime="2025-10-01 16:05:59.097508823 +0000 UTC m=+222.097155658" watchObservedRunningTime="2025-10-01 16:05:59.098072418 +0000 UTC m=+222.097719263" Oct 01 16:05:59 crc kubenswrapper[4764]: I1001 16:05:59.114635 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzspt" podStartSLOduration=2.700367042 podStartE2EDuration="1m2.114616616s" podCreationTimestamp="2025-10-01 16:04:57 +0000 UTC" firstStartedPulling="2025-10-01 16:04:58.37287889 +0000 UTC m=+161.372525735" lastFinishedPulling="2025-10-01 16:05:57.787128464 +0000 UTC m=+220.786775309" observedRunningTime="2025-10-01 16:05:59.113558 +0000 UTC m=+222.113204855" watchObservedRunningTime="2025-10-01 16:05:59.114616616 +0000 UTC m=+222.114263451" Oct 01 16:05:59 crc kubenswrapper[4764]: I1001 16:05:59.945090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbdlr" event={"ID":"c9e48889-43b2-4025-88f2-3ef8040c45e2","Type":"ContainerStarted","Data":"48cbf5349198804e538656db8991af44b127907ee40af6c8eaeb53ebbd81259a"} Oct 01 16:05:59 crc 
kubenswrapper[4764]: I1001 16:05:59.966964 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sbdlr" podStartSLOduration=3.120704197 podStartE2EDuration="1m1.966943138s" podCreationTimestamp="2025-10-01 16:04:58 +0000 UTC" firstStartedPulling="2025-10-01 16:05:00.464146163 +0000 UTC m=+163.463792998" lastFinishedPulling="2025-10-01 16:05:59.310385104 +0000 UTC m=+222.310031939" observedRunningTime="2025-10-01 16:05:59.966131508 +0000 UTC m=+222.965778343" watchObservedRunningTime="2025-10-01 16:05:59.966943138 +0000 UTC m=+222.966589973" Oct 01 16:06:02 crc kubenswrapper[4764]: I1001 16:06:02.191699 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hn4rz"] Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.344108 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.344475 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.387507 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.497410 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.497477 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.871098 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.907142 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.907404 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:06:05 crc kubenswrapper[4764]: I1001 16:06:05.947650 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:06:06 crc kubenswrapper[4764]: I1001 16:06:06.025124 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:06:06 crc kubenswrapper[4764]: I1001 16:06:06.027681 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:06:06 crc kubenswrapper[4764]: I1001 16:06:06.033347 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:06:06 crc kubenswrapper[4764]: I1001 16:06:06.568490 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmw68"] Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.539788 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.540111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.580839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.896037 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.896090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.944988 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:06:07 crc kubenswrapper[4764]: I1001 16:06:07.992208 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xmw68" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="registry-server" containerID="cri-o://f1bd10a89d3e4fcf55f0aff3fe93df0beda8b1e6dbb2ebadc61b14a8df111a3a" gracePeriod=2 Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.033027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.056176 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.509345 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.509404 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.580761 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.922762 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.922849 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:06:08 crc kubenswrapper[4764]: I1001 16:06:08.962182 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:06:09 crc kubenswrapper[4764]: I1001 16:06:09.031845 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:06:09 crc kubenswrapper[4764]: I1001 16:06:09.033592 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.002725 4764 generic.go:334] "Generic (PLEG): container finished" podID="f954a6cc-bafb-431d-809f-3f20374f189d" containerID="f1bd10a89d3e4fcf55f0aff3fe93df0beda8b1e6dbb2ebadc61b14a8df111a3a" exitCode=0 Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.003986 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmw68" event={"ID":"f954a6cc-bafb-431d-809f-3f20374f189d","Type":"ContainerDied","Data":"f1bd10a89d3e4fcf55f0aff3fe93df0beda8b1e6dbb2ebadc61b14a8df111a3a"} Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.367376 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngn7h"] Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.367671 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ngn7h" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="registry-server" containerID="cri-o://1d3b03bb8c96939b626deb2b7e0b6f4326dc773934acad42a755e9ef78ff9673" gracePeriod=2 Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.479861 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.643836 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-catalog-content\") pod \"f954a6cc-bafb-431d-809f-3f20374f189d\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.643911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-utilities\") pod \"f954a6cc-bafb-431d-809f-3f20374f189d\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.643980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkkrn\" (UniqueName: \"kubernetes.io/projected/f954a6cc-bafb-431d-809f-3f20374f189d-kube-api-access-tkkrn\") pod \"f954a6cc-bafb-431d-809f-3f20374f189d\" (UID: \"f954a6cc-bafb-431d-809f-3f20374f189d\") " Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.644678 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-utilities" (OuterVolumeSpecName: "utilities") pod "f954a6cc-bafb-431d-809f-3f20374f189d" (UID: "f954a6cc-bafb-431d-809f-3f20374f189d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.650615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f954a6cc-bafb-431d-809f-3f20374f189d-kube-api-access-tkkrn" (OuterVolumeSpecName: "kube-api-access-tkkrn") pod "f954a6cc-bafb-431d-809f-3f20374f189d" (UID: "f954a6cc-bafb-431d-809f-3f20374f189d"). InnerVolumeSpecName "kube-api-access-tkkrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.700016 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f954a6cc-bafb-431d-809f-3f20374f189d" (UID: "f954a6cc-bafb-431d-809f-3f20374f189d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.745727 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.745761 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f954a6cc-bafb-431d-809f-3f20374f189d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:10 crc kubenswrapper[4764]: I1001 16:06:10.745774 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkkrn\" (UniqueName: \"kubernetes.io/projected/f954a6cc-bafb-431d-809f-3f20374f189d-kube-api-access-tkkrn\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.010375 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xmw68" Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.013093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xmw68" event={"ID":"f954a6cc-bafb-431d-809f-3f20374f189d","Type":"ContainerDied","Data":"552159ec421242a3f1673a20fb9d146c0ef1f9b35b01287f89b6ec13b4dadcaa"} Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.013139 4764 scope.go:117] "RemoveContainer" containerID="f1bd10a89d3e4fcf55f0aff3fe93df0beda8b1e6dbb2ebadc61b14a8df111a3a" Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.029580 4764 scope.go:117] "RemoveContainer" containerID="0e800de13b2f3fe4fdc88ea8cfde0bfd856d78b8943ab0b76c8310236ad42e2f" Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.041602 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xmw68"] Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.045258 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xmw68"] Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.052997 4764 scope.go:117] "RemoveContainer" containerID="05c5efdb10e62d3608751edc6e299ace89c2f68bc1c2db9ff4d277d1b00938f2" Oct 01 16:06:11 crc kubenswrapper[4764]: I1001 16:06:11.727749 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" path="/var/lib/kubelet/pods/f954a6cc-bafb-431d-809f-3f20374f189d/volumes" Oct 01 16:06:12 crc kubenswrapper[4764]: I1001 16:06:12.768334 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sbdlr"] Oct 01 16:06:12 crc kubenswrapper[4764]: I1001 16:06:12.768885 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sbdlr" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="registry-server" 
containerID="cri-o://48cbf5349198804e538656db8991af44b127907ee40af6c8eaeb53ebbd81259a" gracePeriod=2 Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.021413 4764 generic.go:334] "Generic (PLEG): container finished" podID="11e59851-5279-4f01-be7f-b9b17e85892a" containerID="1d3b03bb8c96939b626deb2b7e0b6f4326dc773934acad42a755e9ef78ff9673" exitCode=0 Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.021455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerDied","Data":"1d3b03bb8c96939b626deb2b7e0b6f4326dc773934acad42a755e9ef78ff9673"} Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.527961 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.683960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-utilities\") pod \"11e59851-5279-4f01-be7f-b9b17e85892a\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.684360 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-catalog-content\") pod \"11e59851-5279-4f01-be7f-b9b17e85892a\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.684431 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvfkg\" (UniqueName: \"kubernetes.io/projected/11e59851-5279-4f01-be7f-b9b17e85892a-kube-api-access-gvfkg\") pod \"11e59851-5279-4f01-be7f-b9b17e85892a\" (UID: \"11e59851-5279-4f01-be7f-b9b17e85892a\") " Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 
16:06:13.684844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-utilities" (OuterVolumeSpecName: "utilities") pod "11e59851-5279-4f01-be7f-b9b17e85892a" (UID: "11e59851-5279-4f01-be7f-b9b17e85892a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.692040 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e59851-5279-4f01-be7f-b9b17e85892a-kube-api-access-gvfkg" (OuterVolumeSpecName: "kube-api-access-gvfkg") pod "11e59851-5279-4f01-be7f-b9b17e85892a" (UID: "11e59851-5279-4f01-be7f-b9b17e85892a"). InnerVolumeSpecName "kube-api-access-gvfkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.699028 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11e59851-5279-4f01-be7f-b9b17e85892a" (UID: "11e59851-5279-4f01-be7f-b9b17e85892a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.785444 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.785477 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11e59851-5279-4f01-be7f-b9b17e85892a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:13 crc kubenswrapper[4764]: I1001 16:06:13.785489 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvfkg\" (UniqueName: \"kubernetes.io/projected/11e59851-5279-4f01-be7f-b9b17e85892a-kube-api-access-gvfkg\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.027735 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerID="48cbf5349198804e538656db8991af44b127907ee40af6c8eaeb53ebbd81259a" exitCode=0 Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.027800 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbdlr" event={"ID":"c9e48889-43b2-4025-88f2-3ef8040c45e2","Type":"ContainerDied","Data":"48cbf5349198804e538656db8991af44b127907ee40af6c8eaeb53ebbd81259a"} Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.030040 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ngn7h" event={"ID":"11e59851-5279-4f01-be7f-b9b17e85892a","Type":"ContainerDied","Data":"4782bb7477a38a574cc651a3dad3024a5ca21ce2b16f68de1ec1c9ae18842dfa"} Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.030110 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ngn7h" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.030118 4764 scope.go:117] "RemoveContainer" containerID="1d3b03bb8c96939b626deb2b7e0b6f4326dc773934acad42a755e9ef78ff9673" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.045749 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngn7h"] Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.047452 4764 scope.go:117] "RemoveContainer" containerID="3b0b27c31409088f8474db183c9730c7f5d9e5c8fa80eb5f17adf52c6ab3d3c6" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.049311 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ngn7h"] Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.061317 4764 scope.go:117] "RemoveContainer" containerID="9a1aa9fc0a36e3b76c08f5ac15f5b40f19dd631c043dcfe834721e96178b8fbf" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.630870 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.798263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm2rs\" (UniqueName: \"kubernetes.io/projected/c9e48889-43b2-4025-88f2-3ef8040c45e2-kube-api-access-hm2rs\") pod \"c9e48889-43b2-4025-88f2-3ef8040c45e2\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.798330 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-utilities\") pod \"c9e48889-43b2-4025-88f2-3ef8040c45e2\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.798387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-catalog-content\") pod \"c9e48889-43b2-4025-88f2-3ef8040c45e2\" (UID: \"c9e48889-43b2-4025-88f2-3ef8040c45e2\") " Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.799375 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-utilities" (OuterVolumeSpecName: "utilities") pod "c9e48889-43b2-4025-88f2-3ef8040c45e2" (UID: "c9e48889-43b2-4025-88f2-3ef8040c45e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.803272 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e48889-43b2-4025-88f2-3ef8040c45e2-kube-api-access-hm2rs" (OuterVolumeSpecName: "kube-api-access-hm2rs") pod "c9e48889-43b2-4025-88f2-3ef8040c45e2" (UID: "c9e48889-43b2-4025-88f2-3ef8040c45e2"). InnerVolumeSpecName "kube-api-access-hm2rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.890394 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9e48889-43b2-4025-88f2-3ef8040c45e2" (UID: "c9e48889-43b2-4025-88f2-3ef8040c45e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.899412 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm2rs\" (UniqueName: \"kubernetes.io/projected/c9e48889-43b2-4025-88f2-3ef8040c45e2-kube-api-access-hm2rs\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.899453 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:14 crc kubenswrapper[4764]: I1001 16:06:14.899463 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e48889-43b2-4025-88f2-3ef8040c45e2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.038102 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sbdlr" Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.038038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sbdlr" event={"ID":"c9e48889-43b2-4025-88f2-3ef8040c45e2","Type":"ContainerDied","Data":"363c3875aeec958806cb7c84b167375223acb5d2ba8d3d1529c9c05b5f393406"} Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.038250 4764 scope.go:117] "RemoveContainer" containerID="48cbf5349198804e538656db8991af44b127907ee40af6c8eaeb53ebbd81259a" Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.057324 4764 scope.go:117] "RemoveContainer" containerID="ec000b95c20570cf98397261bec460128ecbac2166ae4e295907711c8f5ad6dd" Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.067234 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sbdlr"] Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.073016 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sbdlr"] Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.077263 4764 scope.go:117] "RemoveContainer" containerID="2eb240689897c215b036eecc54d34c59db8217765c73009d0a037191b9d7c48f" Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.728556 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" path="/var/lib/kubelet/pods/11e59851-5279-4f01-be7f-b9b17e85892a/volumes" Oct 01 16:06:15 crc kubenswrapper[4764]: I1001 16:06:15.729224 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" path="/var/lib/kubelet/pods/c9e48889-43b2-4025-88f2-3ef8040c45e2/volumes" Oct 01 16:06:27 crc kubenswrapper[4764]: I1001 16:06:27.217916 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" 
podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerName="oauth-openshift" containerID="cri-o://f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f" gracePeriod=15 Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.099997 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.109975 4764 generic.go:334] "Generic (PLEG): container finished" podID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerID="f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f" exitCode=0 Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.110016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" event={"ID":"9cc0e821-77ea-4840-be3d-1165904bf50d","Type":"ContainerDied","Data":"f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f"} Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.110063 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.110078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" event={"ID":"9cc0e821-77ea-4840-be3d-1165904bf50d","Type":"ContainerDied","Data":"9f7f0431c4b4cf35921600b65b7eb4ec151e735e08ec8d7cf2a665a196ca2ebc"} Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.110097 4764 scope.go:117] "RemoveContainer" containerID="f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134556 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt"] Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.134839 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134866 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.134881 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerName="oauth-openshift" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134889 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerName="oauth-openshift" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.134901 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134908 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="registry-server" Oct 01 16:06:28 crc 
kubenswrapper[4764]: E1001 16:06:28.134918 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134926 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.134939 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134947 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.134988 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.134998 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135011 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135019 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135027 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135034 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="extract-content" Oct 01 16:06:28 crc 
kubenswrapper[4764]: E1001 16:06:28.135060 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde2a409-91df-47a0-b77e-390d1effcc03" containerName="pruner" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135068 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde2a409-91df-47a0-b77e-390d1effcc03" containerName="pruner" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135078 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c743cf-db8f-468b-9c3a-0442af42b512" containerName="pruner" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135085 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c743cf-db8f-468b-9c3a-0442af42b512" containerName="pruner" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135094 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135102 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135113 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135120 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135129 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135137 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135146 
4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135154 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="extract-content" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.135168 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135176 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="extract-utilities" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135906 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c743cf-db8f-468b-9c3a-0442af42b512" containerName="pruner" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135926 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="90a2cc50-3e21-4686-b89d-9263fd00cec7" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135941 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f954a6cc-bafb-431d-809f-3f20374f189d" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.135951 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde2a409-91df-47a0-b77e-390d1effcc03" containerName="pruner" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.136002 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e48889-43b2-4025-88f2-3ef8040c45e2" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.136014 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerName="oauth-openshift" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.136022 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e59851-5279-4f01-be7f-b9b17e85892a" containerName="registry-server" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.139855 4764 scope.go:117] "RemoveContainer" containerID="f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.139911 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: E1001 16:06:28.140371 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f\": container with ID starting with f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f not found: ID does not exist" containerID="f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.140443 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f"} err="failed to get container status \"f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f\": rpc error: code = NotFound desc = could not find container \"f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f\": container with ID starting with f693810b515f1fe6b8992ac37ecedaf5ba563074cb49d9e554867336f4de894f not found: ID does not exist" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.144932 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt"] Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.169737 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-session\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4j5s\" (UniqueName: \"kubernetes.io/projected/9cc0e821-77ea-4840-be3d-1165904bf50d-kube-api-access-z4j5s\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-serving-cert\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170826 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-provider-selection\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170858 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-idp-0-file-data\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170880 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-dir\") pod 
\"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-cliconfig\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170926 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-policies\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-trusted-ca-bundle\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.170969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-router-certs\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171035 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-ocp-branding-template\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: 
\"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171127 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-login\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-error\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171169 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-service-ca\") pod \"9cc0e821-77ea-4840-be3d-1165904bf50d\" (UID: \"9cc0e821-77ea-4840-be3d-1165904bf50d\") " Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: 
\"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171324 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171385 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171436 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc 
kubenswrapper[4764]: I1001 16:06:28.171466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqnzs\" (UniqueName: \"kubernetes.io/projected/491e363b-813f-4871-8041-c949eee799a3-kube-api-access-dqnzs\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171686 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171716 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/491e363b-813f-4871-8041-c949eee799a3-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.171735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.173062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod 
"9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.173524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.174059 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.174321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.174318 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.176770 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc0e821-77ea-4840-be3d-1165904bf50d-kube-api-access-z4j5s" (OuterVolumeSpecName: "kube-api-access-z4j5s") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "kube-api-access-z4j5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.176889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.176951 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.177095 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.177384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.177509 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.177671 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.177805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.178129 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9cc0e821-77ea-4840-be3d-1165904bf50d" (UID: "9cc0e821-77ea-4840-be3d-1165904bf50d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.272974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273064 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273146 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqnzs\" (UniqueName: \"kubernetes.io/projected/491e363b-813f-4871-8041-c949eee799a3-kube-api-access-dqnzs\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/491e363b-813f-4871-8041-c949eee799a3-audit-dir\") pod 
\"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273437 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273585 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273672 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4j5s\" (UniqueName: \"kubernetes.io/projected/9cc0e821-77ea-4840-be3d-1165904bf50d-kube-api-access-z4j5s\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273683 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273695 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273705 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 
01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273715 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273725 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273734 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273745 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273754 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273789 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273799 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273814 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273833 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273846 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9cc0e821-77ea-4840-be3d-1165904bf50d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.273736 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/491e363b-813f-4871-8041-c949eee799a3-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.274276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.274658 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.275177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.275227 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/491e363b-813f-4871-8041-c949eee799a3-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.277131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.277633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 
01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.277645 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.277866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.278573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.278817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.280339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.281263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/491e363b-813f-4871-8041-c949eee799a3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.296097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqnzs\" (UniqueName: \"kubernetes.io/projected/491e363b-813f-4871-8041-c949eee799a3-kube-api-access-dqnzs\") pod \"oauth-openshift-5d4f55d7c5-45bjt\" (UID: \"491e363b-813f-4871-8041-c949eee799a3\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.443476 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hn4rz"] Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.445624 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hn4rz"] Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.459333 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:28 crc kubenswrapper[4764]: I1001 16:06:28.871037 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt"] Oct 01 16:06:29 crc kubenswrapper[4764]: I1001 16:06:29.024074 4764 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hn4rz container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 16:06:29 crc kubenswrapper[4764]: I1001 16:06:29.024414 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hn4rz" podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 16:06:29 crc kubenswrapper[4764]: I1001 16:06:29.118249 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" event={"ID":"491e363b-813f-4871-8041-c949eee799a3","Type":"ContainerStarted","Data":"06d79319fa484888e4448407ea1733ebd45f6e12cc09081b333926dbadb570bf"} Oct 01 16:06:29 crc kubenswrapper[4764]: I1001 16:06:29.744542 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc0e821-77ea-4840-be3d-1165904bf50d" path="/var/lib/kubelet/pods/9cc0e821-77ea-4840-be3d-1165904bf50d/volumes" Oct 01 16:06:30 crc kubenswrapper[4764]: I1001 16:06:30.125143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" 
event={"ID":"491e363b-813f-4871-8041-c949eee799a3","Type":"ContainerStarted","Data":"569d8c570ddb4018b1fe1d886570176e09fbec664c15bd1112ebef2eec748a79"} Oct 01 16:06:30 crc kubenswrapper[4764]: I1001 16:06:30.125504 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:30 crc kubenswrapper[4764]: I1001 16:06:30.130833 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" Oct 01 16:06:30 crc kubenswrapper[4764]: I1001 16:06:30.168978 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-45bjt" podStartSLOduration=28.168954725 podStartE2EDuration="28.168954725s" podCreationTimestamp="2025-10-01 16:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:06:30.146206449 +0000 UTC m=+253.145853284" watchObservedRunningTime="2025-10-01 16:06:30.168954725 +0000 UTC m=+253.168601560" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.205839 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q5pm5"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.206603 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzcqj"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.207006 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qzcqj" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="registry-server" containerID="cri-o://bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d" gracePeriod=30 Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.207420 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-q5pm5" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="registry-server" containerID="cri-o://030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9" gracePeriod=30 Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.222871 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qk"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.223455 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerName="marketplace-operator" containerID="cri-o://7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f" gracePeriod=30 Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.236804 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzspt"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.237147 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzspt" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="registry-server" containerID="cri-o://13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d" gracePeriod=30 Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.243707 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xdv29"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.244374 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.257782 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dr8vb"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.259196 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dr8vb" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="registry-server" containerID="cri-o://bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5" gracePeriod=30 Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.261130 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xdv29"] Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.338017 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/176eab78-eb2e-4612-994f-d13d95e6c80d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.338129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw985\" (UniqueName: \"kubernetes.io/projected/176eab78-eb2e-4612-994f-d13d95e6c80d-kube-api-access-xw985\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.338167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/176eab78-eb2e-4612-994f-d13d95e6c80d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.439516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/176eab78-eb2e-4612-994f-d13d95e6c80d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.439568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw985\" (UniqueName: \"kubernetes.io/projected/176eab78-eb2e-4612-994f-d13d95e6c80d-kube-api-access-xw985\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.439586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/176eab78-eb2e-4612-994f-d13d95e6c80d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.442928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/176eab78-eb2e-4612-994f-d13d95e6c80d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 
crc kubenswrapper[4764]: I1001 16:06:41.446313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/176eab78-eb2e-4612-994f-d13d95e6c80d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.462713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw985\" (UniqueName: \"kubernetes.io/projected/176eab78-eb2e-4612-994f-d13d95e6c80d-kube-api-access-xw985\") pod \"marketplace-operator-79b997595-xdv29\" (UID: \"176eab78-eb2e-4612-994f-d13d95e6c80d\") " pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.579146 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.685543 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.702630 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.770202 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.776658 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.800988 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.854274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt464\" (UniqueName: \"kubernetes.io/projected/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-kube-api-access-xt464\") pod \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.854329 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8qwk\" (UniqueName: \"kubernetes.io/projected/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-kube-api-access-q8qwk\") pod \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.854381 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-operator-metrics\") pod \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.854531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-utilities\") pod \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.854552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-catalog-content\") pod \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\" (UID: \"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.854574 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-trusted-ca\") pod \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\" (UID: \"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.855323 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-utilities" (OuterVolumeSpecName: "utilities") pod "6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" (UID: "6729eb41-7fe9-4e39-bffa-0c3180e2e6ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.855808 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" (UID: "c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.858543 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-kube-api-access-q8qwk" (OuterVolumeSpecName: "kube-api-access-q8qwk") pod "c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" (UID: "c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2"). InnerVolumeSpecName "kube-api-access-q8qwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.858521 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" (UID: "c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.859310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-kube-api-access-xt464" (OuterVolumeSpecName: "kube-api-access-xt464") pod "6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" (UID: "6729eb41-7fe9-4e39-bffa-0c3180e2e6ed"). InnerVolumeSpecName "kube-api-access-xt464". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864343 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-utilities\") pod \"e169a786-1e32-414c-a44a-fa7afa25d04a\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864776 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864805 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864819 4764 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864830 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt464\" (UniqueName: \"kubernetes.io/projected/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-kube-api-access-xt464\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864842 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8qwk\" (UniqueName: \"kubernetes.io/projected/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2-kube-api-access-q8qwk\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.864994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-utilities" (OuterVolumeSpecName: "utilities") pod "e169a786-1e32-414c-a44a-fa7afa25d04a" (UID: "e169a786-1e32-414c-a44a-fa7afa25d04a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.901932 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" (UID: "6729eb41-7fe9-4e39-bffa-0c3180e2e6ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s98gc\" (UniqueName: \"kubernetes.io/projected/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-kube-api-access-s98gc\") pod \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7czk\" (UniqueName: \"kubernetes.io/projected/e169a786-1e32-414c-a44a-fa7afa25d04a-kube-api-access-j7czk\") pod \"e169a786-1e32-414c-a44a-fa7afa25d04a\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965667 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-catalog-content\") pod \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-utilities\") pod \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965756 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-catalog-content\") pod \"e169a786-1e32-414c-a44a-fa7afa25d04a\" (UID: \"e169a786-1e32-414c-a44a-fa7afa25d04a\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8whlc\" (UniqueName: \"kubernetes.io/projected/c2c1132f-fe3a-4e68-bc7d-0362951208ac-kube-api-access-8whlc\") pod \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\" (UID: \"c2c1132f-fe3a-4e68-bc7d-0362951208ac\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965834 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-utilities\") pod \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.965859 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-catalog-content\") pod \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\" (UID: \"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea\") " Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.966066 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.966083 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.968119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e169a786-1e32-414c-a44a-fa7afa25d04a-kube-api-access-j7czk" (OuterVolumeSpecName: "kube-api-access-j7czk") pod "e169a786-1e32-414c-a44a-fa7afa25d04a" (UID: "e169a786-1e32-414c-a44a-fa7afa25d04a"). InnerVolumeSpecName "kube-api-access-j7czk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.968478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-kube-api-access-s98gc" (OuterVolumeSpecName: "kube-api-access-s98gc") pod "a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" (UID: "a5e4a715-7c4f-42c0-a4a0-1de63c1482ea"). InnerVolumeSpecName "kube-api-access-s98gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.969193 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-utilities" (OuterVolumeSpecName: "utilities") pod "a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" (UID: "a5e4a715-7c4f-42c0-a4a0-1de63c1482ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.969247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-utilities" (OuterVolumeSpecName: "utilities") pod "c2c1132f-fe3a-4e68-bc7d-0362951208ac" (UID: "c2c1132f-fe3a-4e68-bc7d-0362951208ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.970604 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c1132f-fe3a-4e68-bc7d-0362951208ac-kube-api-access-8whlc" (OuterVolumeSpecName: "kube-api-access-8whlc") pod "c2c1132f-fe3a-4e68-bc7d-0362951208ac" (UID: "c2c1132f-fe3a-4e68-bc7d-0362951208ac"). InnerVolumeSpecName "kube-api-access-8whlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:06:41 crc kubenswrapper[4764]: I1001 16:06:41.982501 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2c1132f-fe3a-4e68-bc7d-0362951208ac" (UID: "c2c1132f-fe3a-4e68-bc7d-0362951208ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.014492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" (UID: "a5e4a715-7c4f-42c0-a4a0-1de63c1482ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.055280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e169a786-1e32-414c-a44a-fa7afa25d04a" (UID: "e169a786-1e32-414c-a44a-fa7afa25d04a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.066970 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.067009 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2c1132f-fe3a-4e68-bc7d-0362951208ac-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.067022 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e169a786-1e32-414c-a44a-fa7afa25d04a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.067034 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8whlc\" (UniqueName: \"kubernetes.io/projected/c2c1132f-fe3a-4e68-bc7d-0362951208ac-kube-api-access-8whlc\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.067063 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.067075 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.067086 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s98gc\" (UniqueName: \"kubernetes.io/projected/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea-kube-api-access-s98gc\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 
16:06:42.067099 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7czk\" (UniqueName: \"kubernetes.io/projected/e169a786-1e32-414c-a44a-fa7afa25d04a-kube-api-access-j7czk\") on node \"crc\" DevicePath \"\"" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.069854 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xdv29"] Oct 01 16:06:42 crc kubenswrapper[4764]: W1001 16:06:42.075228 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod176eab78_eb2e_4612_994f_d13d95e6c80d.slice/crio-265925d00d4f5f70267e43a901034fc106f881da0664ba3afff62b873c5ba66f WatchSource:0}: Error finding container 265925d00d4f5f70267e43a901034fc106f881da0664ba3afff62b873c5ba66f: Status 404 returned error can't find the container with id 265925d00d4f5f70267e43a901034fc106f881da0664ba3afff62b873c5ba66f Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.193918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" event={"ID":"176eab78-eb2e-4612-994f-d13d95e6c80d","Type":"ContainerStarted","Data":"ce9eca30fbbb960c92af6f49db65865a46e0d725c0a2aa6e133589e241cb62e9"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.193969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" event={"ID":"176eab78-eb2e-4612-994f-d13d95e6c80d","Type":"ContainerStarted","Data":"265925d00d4f5f70267e43a901034fc106f881da0664ba3afff62b873c5ba66f"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.195678 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200386 4764 generic.go:334] "Generic (PLEG): container finished" podID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" 
containerID="030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9" exitCode=0 Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200460 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q5pm5" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5pm5" event={"ID":"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea","Type":"ContainerDied","Data":"030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5pm5" event={"ID":"a5e4a715-7c4f-42c0-a4a0-1de63c1482ea","Type":"ContainerDied","Data":"3f9d70b11151ad87a2fa1b1d7d19cd5ab5d79281dfac6e1b38b738301de1bea0"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200621 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xdv29 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200642 4764 scope.go:117] "RemoveContainer" containerID="030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.200657 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" podUID="176eab78-eb2e-4612-994f-d13d95e6c80d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.214427 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerID="bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d" exitCode=0 Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.214535 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerDied","Data":"bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.214571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qzcqj" event={"ID":"6729eb41-7fe9-4e39-bffa-0c3180e2e6ed","Type":"ContainerDied","Data":"6626a0ceceec15cf1c3f12e4019c28446891c9a26f921f305a267b0b60da79c8"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.214664 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qzcqj" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.215589 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" podStartSLOduration=1.215576311 podStartE2EDuration="1.215576311s" podCreationTimestamp="2025-10-01 16:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:06:42.214868853 +0000 UTC m=+265.214515698" watchObservedRunningTime="2025-10-01 16:06:42.215576311 +0000 UTC m=+265.215223156" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.217629 4764 generic.go:334] "Generic (PLEG): container finished" podID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerID="13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d" exitCode=0 Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.217841 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzspt" 
event={"ID":"c2c1132f-fe3a-4e68-bc7d-0362951208ac","Type":"ContainerDied","Data":"13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.217942 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzspt" event={"ID":"c2c1132f-fe3a-4e68-bc7d-0362951208ac","Type":"ContainerDied","Data":"f9a63b3d7da6e6caff0e91bd58a0a30f8b5db49b74e54025034b97711b6e21ea"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.218111 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzspt" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.223989 4764 generic.go:334] "Generic (PLEG): container finished" podID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerID="7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f" exitCode=0 Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.224035 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" event={"ID":"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2","Type":"ContainerDied","Data":"7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.224102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" event={"ID":"c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2","Type":"ContainerDied","Data":"1eda25385477d0272ffeba455178d403f4af5b68315209afa188ce8b7ff8b2fb"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.226863 4764 generic.go:334] "Generic (PLEG): container finished" podID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerID="bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5" exitCode=0 Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.226894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerDied","Data":"bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.226913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dr8vb" event={"ID":"e169a786-1e32-414c-a44a-fa7afa25d04a","Type":"ContainerDied","Data":"0d810dabe9002c52028f54f763bdff8d77681f7749dc47275d156fec230057ca"} Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.226986 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dr8vb" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.228122 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cd5qk" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.242958 4764 scope.go:117] "RemoveContainer" containerID="9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.253778 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q5pm5"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.257655 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q5pm5"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.262910 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qzcqj"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.271486 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qzcqj"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.285512 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dr8vb"] Oct 01 16:06:42 crc 
kubenswrapper[4764]: I1001 16:06:42.286197 4764 scope.go:117] "RemoveContainer" containerID="a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.295674 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dr8vb"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.299410 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzspt"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.305295 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzspt"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.306923 4764 scope.go:117] "RemoveContainer" containerID="030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.307515 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9\": container with ID starting with 030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9 not found: ID does not exist" containerID="030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.307557 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9"} err="failed to get container status \"030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9\": rpc error: code = NotFound desc = could not find container \"030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9\": container with ID starting with 030c6cf18af52c2eff68129708c5ec799a3a024c654757257c5ec9ccb8cc0fb9 not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.307582 4764 scope.go:117] 
"RemoveContainer" containerID="9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.307910 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb\": container with ID starting with 9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb not found: ID does not exist" containerID="9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.307997 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb"} err="failed to get container status \"9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb\": rpc error: code = NotFound desc = could not find container \"9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb\": container with ID starting with 9288a476be7273f86a7e81d0cc843a354f269ab4d8c460a94b31a13813e3edcb not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.308013 4764 scope.go:117] "RemoveContainer" containerID="a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.308548 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1\": container with ID starting with a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1 not found: ID does not exist" containerID="a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.308572 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1"} err="failed to get container status \"a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1\": rpc error: code = NotFound desc = could not find container \"a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1\": container with ID starting with a9ccd932d4cdf51fb2de5cc71fe07e300f26630158df88bec64e506d439618c1 not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.308597 4764 scope.go:117] "RemoveContainer" containerID="bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.311134 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qk"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.314321 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cd5qk"] Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.321694 4764 scope.go:117] "RemoveContainer" containerID="52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.334510 4764 scope.go:117] "RemoveContainer" containerID="5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.346715 4764 scope.go:117] "RemoveContainer" containerID="bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.347037 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d\": container with ID starting with bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d not found: ID does not exist" 
containerID="bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.347091 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d"} err="failed to get container status \"bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d\": rpc error: code = NotFound desc = could not find container \"bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d\": container with ID starting with bf157f0d52df931228ee5a2b0988d6a1cd48a2a1c46b20b12945e3e9160ba90d not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.347120 4764 scope.go:117] "RemoveContainer" containerID="52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.347460 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a\": container with ID starting with 52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a not found: ID does not exist" containerID="52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.347488 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a"} err="failed to get container status \"52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a\": rpc error: code = NotFound desc = could not find container \"52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a\": container with ID starting with 52f4de259d6270ced48829094f9148483bebbf984a386fdef6557599ad13ca4a not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.347507 4764 scope.go:117] 
"RemoveContainer" containerID="5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.347739 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f\": container with ID starting with 5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f not found: ID does not exist" containerID="5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.347763 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f"} err="failed to get container status \"5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f\": rpc error: code = NotFound desc = could not find container \"5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f\": container with ID starting with 5491cd7c371fe063af2cbf0f9415c482b8a096aff01730324b2d3d2c684ac88f not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.347781 4764 scope.go:117] "RemoveContainer" containerID="13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.362666 4764 scope.go:117] "RemoveContainer" containerID="d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.406385 4764 scope.go:117] "RemoveContainer" containerID="8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.421343 4764 scope.go:117] "RemoveContainer" containerID="13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.421734 4764 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d\": container with ID starting with 13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d not found: ID does not exist" containerID="13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.421773 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d"} err="failed to get container status \"13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d\": rpc error: code = NotFound desc = could not find container \"13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d\": container with ID starting with 13619eb6b7f53b9774b18b081a0f0cae6a88be2f362df52adc5542735c34e55d not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.421800 4764 scope.go:117] "RemoveContainer" containerID="d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.422152 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1\": container with ID starting with d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1 not found: ID does not exist" containerID="d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.422177 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1"} err="failed to get container status \"d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1\": rpc error: code = NotFound desc = could not find container 
\"d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1\": container with ID starting with d5352cd76c2893dd6f9124969e6ae705333bfb7629e51c99850fc23f2dea56e1 not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.422199 4764 scope.go:117] "RemoveContainer" containerID="8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.422632 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5\": container with ID starting with 8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5 not found: ID does not exist" containerID="8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.422652 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5"} err="failed to get container status \"8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5\": rpc error: code = NotFound desc = could not find container \"8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5\": container with ID starting with 8cdb8f32da9de2d7f84f9dab7e2c4f597c99fe90f71d782199f7df259225e9d5 not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.422671 4764 scope.go:117] "RemoveContainer" containerID="7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.435102 4764 scope.go:117] "RemoveContainer" containerID="7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.435562 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f\": container with ID starting with 7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f not found: ID does not exist" containerID="7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.435603 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f"} err="failed to get container status \"7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f\": rpc error: code = NotFound desc = could not find container \"7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f\": container with ID starting with 7065957328a7799ae3dcef5ecdd3eb2a47668cac2b9b2fe1ad3aec93514e950f not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.435632 4764 scope.go:117] "RemoveContainer" containerID="bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.450395 4764 scope.go:117] "RemoveContainer" containerID="6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.465763 4764 scope.go:117] "RemoveContainer" containerID="49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.481550 4764 scope.go:117] "RemoveContainer" containerID="bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.483003 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5\": container with ID starting with bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5 not found: ID does not exist" 
containerID="bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.483058 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5"} err="failed to get container status \"bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5\": rpc error: code = NotFound desc = could not find container \"bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5\": container with ID starting with bfd675fc6bd2eb86680605f0451f6991fcb9817a194a13e69424b776cbba52b5 not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.483079 4764 scope.go:117] "RemoveContainer" containerID="6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.483841 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856\": container with ID starting with 6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856 not found: ID does not exist" containerID="6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.483864 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856"} err="failed to get container status \"6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856\": rpc error: code = NotFound desc = could not find container \"6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856\": container with ID starting with 6fa886904d6bb3d1f23d7d8143bd2fbeb926f33b48e2425b2da3367a0a9d9856 not found: ID does not exist" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.483880 4764 scope.go:117] 
"RemoveContainer" containerID="49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399" Oct 01 16:06:42 crc kubenswrapper[4764]: E1001 16:06:42.484209 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399\": container with ID starting with 49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399 not found: ID does not exist" containerID="49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399" Oct 01 16:06:42 crc kubenswrapper[4764]: I1001 16:06:42.484227 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399"} err="failed to get container status \"49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399\": rpc error: code = NotFound desc = could not find container \"49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399\": container with ID starting with 49834577eac686a04b3be23e2533323498b0d1292538cc4ec79b805ae94b8399 not found: ID does not exist" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.215452 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qn928"] Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216343 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216370 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216385 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216393 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216405 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216415 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216424 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216432 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216440 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216446 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216469 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216476 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216487 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216494 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216504 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216513 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216522 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerName="marketplace-operator" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216529 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerName="marketplace-operator" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216537 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216545 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216555 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216562 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="extract-content" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216576 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216583 
4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: E1001 16:06:43.216593 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216601 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="extract-utilities" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216742 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216756 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" containerName="marketplace-operator" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216765 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216787 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.216797 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" containerName="registry-server" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.217522 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.219461 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.232088 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qn928"] Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.240555 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xdv29" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.382590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-utilities\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.382687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9zt\" (UniqueName: \"kubernetes.io/projected/3ced474e-2212-4ae4-b305-fbc1f0e05a93-kube-api-access-8s9zt\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.382726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-catalog-content\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.483808 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-utilities\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.483884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9zt\" (UniqueName: \"kubernetes.io/projected/3ced474e-2212-4ae4-b305-fbc1f0e05a93-kube-api-access-8s9zt\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.483905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-catalog-content\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.484561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-utilities\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.485331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-catalog-content\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.502814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8s9zt\" (UniqueName: \"kubernetes.io/projected/3ced474e-2212-4ae4-b305-fbc1f0e05a93-kube-api-access-8s9zt\") pod \"certified-operators-qn928\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.531831 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.727931 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6729eb41-7fe9-4e39-bffa-0c3180e2e6ed" path="/var/lib/kubelet/pods/6729eb41-7fe9-4e39-bffa-0c3180e2e6ed/volumes" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.728630 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e4a715-7c4f-42c0-a4a0-1de63c1482ea" path="/var/lib/kubelet/pods/a5e4a715-7c4f-42c0-a4a0-1de63c1482ea/volumes" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.729429 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c1132f-fe3a-4e68-bc7d-0362951208ac" path="/var/lib/kubelet/pods/c2c1132f-fe3a-4e68-bc7d-0362951208ac/volumes" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.730439 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2" path="/var/lib/kubelet/pods/c932b6b4-a3cf-45d5-9f11-1f9feff6fdb2/volumes" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.730853 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e169a786-1e32-414c-a44a-fa7afa25d04a" path="/var/lib/kubelet/pods/e169a786-1e32-414c-a44a-fa7afa25d04a/volumes" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.820333 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dvfhb"] Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.821573 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.823395 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.825710 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dvfhb"] Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.898966 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qn928"] Oct 01 16:06:43 crc kubenswrapper[4764]: W1001 16:06:43.905308 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ced474e_2212_4ae4_b305_fbc1f0e05a93.slice/crio-40b998128c6d28131fe4e858f32e7709675b31011faa571ca297c4b89f121fcd WatchSource:0}: Error finding container 40b998128c6d28131fe4e858f32e7709675b31011faa571ca297c4b89f121fcd: Status 404 returned error can't find the container with id 40b998128c6d28131fe4e858f32e7709675b31011faa571ca297c4b89f121fcd Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.989180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-catalog-content\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.989231 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntjqv\" (UniqueName: \"kubernetes.io/projected/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-kube-api-access-ntjqv\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 
16:06:43 crc kubenswrapper[4764]: I1001 16:06:43.989254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-utilities\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.091275 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-catalog-content\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.091351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntjqv\" (UniqueName: \"kubernetes.io/projected/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-kube-api-access-ntjqv\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.091395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-utilities\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.091834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-catalog-content\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc 
kubenswrapper[4764]: I1001 16:06:44.092169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-utilities\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.115242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntjqv\" (UniqueName: \"kubernetes.io/projected/c7dad4be-a7b9-48f9-9259-fbaffbb22bd2-kube-api-access-ntjqv\") pod \"redhat-marketplace-dvfhb\" (UID: \"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2\") " pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.142076 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.245183 4764 generic.go:334] "Generic (PLEG): container finished" podID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerID="e5c7055262404416072f800f0a273246ea44ac972902c5c4aa55f33cbae39cce" exitCode=0 Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.245293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn928" event={"ID":"3ced474e-2212-4ae4-b305-fbc1f0e05a93","Type":"ContainerDied","Data":"e5c7055262404416072f800f0a273246ea44ac972902c5c4aa55f33cbae39cce"} Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.245609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn928" event={"ID":"3ced474e-2212-4ae4-b305-fbc1f0e05a93","Type":"ContainerStarted","Data":"40b998128c6d28131fe4e858f32e7709675b31011faa571ca297c4b89f121fcd"} Oct 01 16:06:44 crc kubenswrapper[4764]: I1001 16:06:44.548961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dvfhb"] Oct 01 16:06:44 crc kubenswrapper[4764]: W1001 16:06:44.555019 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7dad4be_a7b9_48f9_9259_fbaffbb22bd2.slice/crio-e5977a55e9e87b9149219266f14c3f0735141db5ff39687f149f70b400c195f0 WatchSource:0}: Error finding container e5977a55e9e87b9149219266f14c3f0735141db5ff39687f149f70b400c195f0: Status 404 returned error can't find the container with id e5977a55e9e87b9149219266f14c3f0735141db5ff39687f149f70b400c195f0 Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.253080 4764 generic.go:334] "Generic (PLEG): container finished" podID="c7dad4be-a7b9-48f9-9259-fbaffbb22bd2" containerID="d45e07adb4481af0b8e40ea490b0d59d775222b193f1defc23b6363682d57643" exitCode=0 Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.253171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvfhb" event={"ID":"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2","Type":"ContainerDied","Data":"d45e07adb4481af0b8e40ea490b0d59d775222b193f1defc23b6363682d57643"} Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.253413 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvfhb" event={"ID":"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2","Type":"ContainerStarted","Data":"e5977a55e9e87b9149219266f14c3f0735141db5ff39687f149f70b400c195f0"} Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.255169 4764 generic.go:334] "Generic (PLEG): container finished" podID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerID="671ea14b516816b121dfe830d1e16d8c2f1d696957ec68f1045f5694563bd81b" exitCode=0 Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.255254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn928" 
event={"ID":"3ced474e-2212-4ae4-b305-fbc1f0e05a93","Type":"ContainerDied","Data":"671ea14b516816b121dfe830d1e16d8c2f1d696957ec68f1045f5694563bd81b"} Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.617745 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-45tgq"] Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.618881 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.621234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.628653 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45tgq"] Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.711262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2v2\" (UniqueName: \"kubernetes.io/projected/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-kube-api-access-4z2v2\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.711780 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-catalog-content\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.711818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-utilities\") pod \"redhat-operators-45tgq\" (UID: 
\"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.813944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-catalog-content\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.814038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-utilities\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.814155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2v2\" (UniqueName: \"kubernetes.io/projected/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-kube-api-access-4z2v2\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.814721 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-utilities\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.815506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-catalog-content\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " 
pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.834415 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2v2\" (UniqueName: \"kubernetes.io/projected/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-kube-api-access-4z2v2\") pod \"redhat-operators-45tgq\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:45 crc kubenswrapper[4764]: I1001 16:06:45.947208 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.215594 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-66v4b"] Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.217872 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.225257 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.229671 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-66v4b"] Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.265957 4764 generic.go:334] "Generic (PLEG): container finished" podID="c7dad4be-a7b9-48f9-9259-fbaffbb22bd2" containerID="eb2a7f8d666c0411b2432e73a0d650025030af3c3f1306a5dd2c03f7b00d0774" exitCode=0 Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.266024 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvfhb" event={"ID":"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2","Type":"ContainerDied","Data":"eb2a7f8d666c0411b2432e73a0d650025030af3c3f1306a5dd2c03f7b00d0774"} Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 
16:06:46.272436 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn928" event={"ID":"3ced474e-2212-4ae4-b305-fbc1f0e05a93","Type":"ContainerStarted","Data":"7792bab82a75c60a89818ad1692f1c46bde38d2ac4a9f1cd5de1ded2b947ac82"} Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.329542 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1436d169-be0b-479d-8a35-084020e816a2-catalog-content\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.329601 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27ls5\" (UniqueName: \"kubernetes.io/projected/1436d169-be0b-479d-8a35-084020e816a2-kube-api-access-27ls5\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.329622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1436d169-be0b-479d-8a35-084020e816a2-utilities\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.415862 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qn928" podStartSLOduration=2.006613851 podStartE2EDuration="3.415845127s" podCreationTimestamp="2025-10-01 16:06:43 +0000 UTC" firstStartedPulling="2025-10-01 16:06:44.257704486 +0000 UTC m=+267.257351311" lastFinishedPulling="2025-10-01 16:06:45.666935732 +0000 UTC m=+268.666582587" 
observedRunningTime="2025-10-01 16:06:46.302283305 +0000 UTC m=+269.301930150" watchObservedRunningTime="2025-10-01 16:06:46.415845127 +0000 UTC m=+269.415491962" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.416648 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45tgq"] Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.430430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1436d169-be0b-479d-8a35-084020e816a2-catalog-content\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.430653 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27ls5\" (UniqueName: \"kubernetes.io/projected/1436d169-be0b-479d-8a35-084020e816a2-kube-api-access-27ls5\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.430674 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1436d169-be0b-479d-8a35-084020e816a2-utilities\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: W1001 16:06:46.430839 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13b9e77_673e_4b69_93e4_0bc9ab7fe544.slice/crio-ddc7c1d1148dae70a88806937413181257473f7645fb6b8899d0d7fba32bc481 WatchSource:0}: Error finding container ddc7c1d1148dae70a88806937413181257473f7645fb6b8899d0d7fba32bc481: Status 404 returned error can't find the container with id 
ddc7c1d1148dae70a88806937413181257473f7645fb6b8899d0d7fba32bc481 Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.431962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1436d169-be0b-479d-8a35-084020e816a2-catalog-content\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.431968 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1436d169-be0b-479d-8a35-084020e816a2-utilities\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.449104 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27ls5\" (UniqueName: \"kubernetes.io/projected/1436d169-be0b-479d-8a35-084020e816a2-kube-api-access-27ls5\") pod \"community-operators-66v4b\" (UID: \"1436d169-be0b-479d-8a35-084020e816a2\") " pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.534090 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:46 crc kubenswrapper[4764]: I1001 16:06:46.926831 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-66v4b"] Oct 01 16:06:46 crc kubenswrapper[4764]: W1001 16:06:46.936128 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1436d169_be0b_479d_8a35_084020e816a2.slice/crio-ead7016d7536690fa0e435f327985c22cff0e304c47cd3e8e7be492db45da525 WatchSource:0}: Error finding container ead7016d7536690fa0e435f327985c22cff0e304c47cd3e8e7be492db45da525: Status 404 returned error can't find the container with id ead7016d7536690fa0e435f327985c22cff0e304c47cd3e8e7be492db45da525 Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.280720 4764 generic.go:334] "Generic (PLEG): container finished" podID="1436d169-be0b-479d-8a35-084020e816a2" containerID="9ef82b358dd7857d928783e990622d1472e1727199b4365c14fb2593903bec5b" exitCode=0 Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.280816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66v4b" event={"ID":"1436d169-be0b-479d-8a35-084020e816a2","Type":"ContainerDied","Data":"9ef82b358dd7857d928783e990622d1472e1727199b4365c14fb2593903bec5b"} Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.281212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66v4b" event={"ID":"1436d169-be0b-479d-8a35-084020e816a2","Type":"ContainerStarted","Data":"ead7016d7536690fa0e435f327985c22cff0e304c47cd3e8e7be492db45da525"} Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.283136 4764 generic.go:334] "Generic (PLEG): container finished" podID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerID="2273e6f657846bd9fb9e82687503a091491e1a4868a78349cf310450e31ca8dd" exitCode=0 Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 
16:06:47.283200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerDied","Data":"2273e6f657846bd9fb9e82687503a091491e1a4868a78349cf310450e31ca8dd"} Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.283279 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerStarted","Data":"ddc7c1d1148dae70a88806937413181257473f7645fb6b8899d0d7fba32bc481"} Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.285696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dvfhb" event={"ID":"c7dad4be-a7b9-48f9-9259-fbaffbb22bd2","Type":"ContainerStarted","Data":"0a7df463885adfb7146dab259e9898bb8f1d9c532c3b794ac309176a59307e91"} Oct 01 16:06:47 crc kubenswrapper[4764]: I1001 16:06:47.339847 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dvfhb" podStartSLOduration=2.907080815 podStartE2EDuration="4.339829952s" podCreationTimestamp="2025-10-01 16:06:43 +0000 UTC" firstStartedPulling="2025-10-01 16:06:45.254845328 +0000 UTC m=+268.254492163" lastFinishedPulling="2025-10-01 16:06:46.687594475 +0000 UTC m=+269.687241300" observedRunningTime="2025-10-01 16:06:47.338275782 +0000 UTC m=+270.337922617" watchObservedRunningTime="2025-10-01 16:06:47.339829952 +0000 UTC m=+270.339476787" Oct 01 16:06:48 crc kubenswrapper[4764]: I1001 16:06:48.292508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerStarted","Data":"9b69d7ec6eeeebecaa7414817e6d9f3df4d18d3ab60f0fb791e7d78300e5025d"} Oct 01 16:06:48 crc kubenswrapper[4764]: I1001 16:06:48.295286 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-66v4b" event={"ID":"1436d169-be0b-479d-8a35-084020e816a2","Type":"ContainerStarted","Data":"4df7e47d530cc8db79449457ac212c862dd66496ab8084cd832197657e0bb0df"} Oct 01 16:06:49 crc kubenswrapper[4764]: I1001 16:06:49.304436 4764 generic.go:334] "Generic (PLEG): container finished" podID="1436d169-be0b-479d-8a35-084020e816a2" containerID="4df7e47d530cc8db79449457ac212c862dd66496ab8084cd832197657e0bb0df" exitCode=0 Oct 01 16:06:49 crc kubenswrapper[4764]: I1001 16:06:49.304514 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66v4b" event={"ID":"1436d169-be0b-479d-8a35-084020e816a2","Type":"ContainerDied","Data":"4df7e47d530cc8db79449457ac212c862dd66496ab8084cd832197657e0bb0df"} Oct 01 16:06:49 crc kubenswrapper[4764]: I1001 16:06:49.316274 4764 generic.go:334] "Generic (PLEG): container finished" podID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerID="9b69d7ec6eeeebecaa7414817e6d9f3df4d18d3ab60f0fb791e7d78300e5025d" exitCode=0 Oct 01 16:06:49 crc kubenswrapper[4764]: I1001 16:06:49.316364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerDied","Data":"9b69d7ec6eeeebecaa7414817e6d9f3df4d18d3ab60f0fb791e7d78300e5025d"} Oct 01 16:06:50 crc kubenswrapper[4764]: I1001 16:06:50.330387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerStarted","Data":"9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8"} Oct 01 16:06:50 crc kubenswrapper[4764]: I1001 16:06:50.336380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-66v4b" 
event={"ID":"1436d169-be0b-479d-8a35-084020e816a2","Type":"ContainerStarted","Data":"5529cd88d84f9ab048e3ee779e42ea7608dc73443ca68ca46a3ad4b63a91356d"} Oct 01 16:06:50 crc kubenswrapper[4764]: I1001 16:06:50.351690 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-45tgq" podStartSLOduration=2.877480269 podStartE2EDuration="5.35167474s" podCreationTimestamp="2025-10-01 16:06:45 +0000 UTC" firstStartedPulling="2025-10-01 16:06:47.284427599 +0000 UTC m=+270.284074424" lastFinishedPulling="2025-10-01 16:06:49.75862206 +0000 UTC m=+272.758268895" observedRunningTime="2025-10-01 16:06:50.349535596 +0000 UTC m=+273.349182441" watchObservedRunningTime="2025-10-01 16:06:50.35167474 +0000 UTC m=+273.351321575" Oct 01 16:06:50 crc kubenswrapper[4764]: I1001 16:06:50.371469 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-66v4b" podStartSLOduration=1.7125234649999999 podStartE2EDuration="4.371447117s" podCreationTimestamp="2025-10-01 16:06:46 +0000 UTC" firstStartedPulling="2025-10-01 16:06:47.282905871 +0000 UTC m=+270.282552706" lastFinishedPulling="2025-10-01 16:06:49.941829523 +0000 UTC m=+272.941476358" observedRunningTime="2025-10-01 16:06:50.367538749 +0000 UTC m=+273.367185584" watchObservedRunningTime="2025-10-01 16:06:50.371447117 +0000 UTC m=+273.371093952" Oct 01 16:06:53 crc kubenswrapper[4764]: I1001 16:06:53.532808 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:53 crc kubenswrapper[4764]: I1001 16:06:53.533192 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:53 crc kubenswrapper[4764]: I1001 16:06:53.590135 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qn928" Oct 01 
16:06:54 crc kubenswrapper[4764]: I1001 16:06:54.142607 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:54 crc kubenswrapper[4764]: I1001 16:06:54.142683 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:54 crc kubenswrapper[4764]: I1001 16:06:54.195576 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:54 crc kubenswrapper[4764]: I1001 16:06:54.394318 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dvfhb" Oct 01 16:06:54 crc kubenswrapper[4764]: I1001 16:06:54.401693 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:06:55 crc kubenswrapper[4764]: I1001 16:06:55.948315 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:55 crc kubenswrapper[4764]: I1001 16:06:55.948667 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:56 crc kubenswrapper[4764]: I1001 16:06:56.028002 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:56 crc kubenswrapper[4764]: I1001 16:06:56.409522 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:06:56 crc kubenswrapper[4764]: I1001 16:06:56.535355 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:56 crc kubenswrapper[4764]: I1001 16:06:56.535420 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:56 crc kubenswrapper[4764]: I1001 16:06:56.584274 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:06:57 crc kubenswrapper[4764]: I1001 16:06:57.407220 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-66v4b" Oct 01 16:08:21 crc kubenswrapper[4764]: I1001 16:08:21.914528 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:08:21 crc kubenswrapper[4764]: I1001 16:08:21.915138 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:08:51 crc kubenswrapper[4764]: I1001 16:08:51.913956 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:08:51 crc kubenswrapper[4764]: I1001 16:08:51.914842 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:09:21 crc kubenswrapper[4764]: I1001 
16:09:21.914733 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:09:21 crc kubenswrapper[4764]: I1001 16:09:21.915431 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:09:21 crc kubenswrapper[4764]: I1001 16:09:21.915497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:09:21 crc kubenswrapper[4764]: I1001 16:09:21.916403 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:09:21 crc kubenswrapper[4764]: I1001 16:09:21.916506 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317" gracePeriod=600 Oct 01 16:09:22 crc kubenswrapper[4764]: E1001 16:09:22.098710 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2068a381_c49b_41a4_bd0d_8c525f9b30d0.slice/crio-conmon-50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317.scope\": RecentStats: unable to find data in memory cache]" Oct 01 16:09:22 crc kubenswrapper[4764]: I1001 16:09:22.252712 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317" exitCode=0 Oct 01 16:09:22 crc kubenswrapper[4764]: I1001 16:09:22.252759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317"} Oct 01 16:09:22 crc kubenswrapper[4764]: I1001 16:09:22.252794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"c9d37c73ed33c3edc83cd30171905ddb550eb174fab70b91e5c87cb08088ccc7"} Oct 01 16:09:22 crc kubenswrapper[4764]: I1001 16:09:22.252811 4764 scope.go:117] "RemoveContainer" containerID="be4594173ed06c387f9ed1021d035129b808518275e02a55a8fd6fbdeefacd65" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.216955 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jwxdk"] Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.218078 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.234469 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jwxdk"] Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306464 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c87593fb-4b21-4220-9b88-55bcd2e84c98-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-bound-sa-token\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c87593fb-4b21-4220-9b88-55bcd2e84c98-registry-certificates\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqj7\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-kube-api-access-4wqj7\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c87593fb-4b21-4220-9b88-55bcd2e84c98-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-registry-tls\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.306650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c87593fb-4b21-4220-9b88-55bcd2e84c98-trusted-ca\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.342240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c87593fb-4b21-4220-9b88-55bcd2e84c98-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-bound-sa-token\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407565 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqj7\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-kube-api-access-4wqj7\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c87593fb-4b21-4220-9b88-55bcd2e84c98-registry-certificates\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407615 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c87593fb-4b21-4220-9b88-55bcd2e84c98-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-registry-tls\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.407700 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c87593fb-4b21-4220-9b88-55bcd2e84c98-trusted-ca\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.408341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c87593fb-4b21-4220-9b88-55bcd2e84c98-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.408923 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c87593fb-4b21-4220-9b88-55bcd2e84c98-registry-certificates\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 
16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.409523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c87593fb-4b21-4220-9b88-55bcd2e84c98-trusted-ca\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.415003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c87593fb-4b21-4220-9b88-55bcd2e84c98-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.415046 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-registry-tls\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.422762 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-bound-sa-token\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.439263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqj7\" (UniqueName: \"kubernetes.io/projected/c87593fb-4b21-4220-9b88-55bcd2e84c98-kube-api-access-4wqj7\") pod \"image-registry-66df7c8f76-jwxdk\" (UID: \"c87593fb-4b21-4220-9b88-55bcd2e84c98\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.539501 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:04 crc kubenswrapper[4764]: I1001 16:10:04.745154 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jwxdk"] Oct 01 16:10:05 crc kubenswrapper[4764]: I1001 16:10:05.518246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" event={"ID":"c87593fb-4b21-4220-9b88-55bcd2e84c98","Type":"ContainerStarted","Data":"d247dc5885db4fd42963af0b45cbf3332d851517e0381ec0fe810fed413f7457"} Oct 01 16:10:05 crc kubenswrapper[4764]: I1001 16:10:05.518567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" event={"ID":"c87593fb-4b21-4220-9b88-55bcd2e84c98","Type":"ContainerStarted","Data":"482e3d08563cc4aedfc5a2c7ee9921a3e83d1efd83086033e6a9c70a588e7998"} Oct 01 16:10:05 crc kubenswrapper[4764]: I1001 16:10:05.519172 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:24 crc kubenswrapper[4764]: I1001 16:10:24.549024 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" Oct 01 16:10:24 crc kubenswrapper[4764]: I1001 16:10:24.580289 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jwxdk" podStartSLOduration=20.580258454 podStartE2EDuration="20.580258454s" podCreationTimestamp="2025-10-01 16:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:10:05.536479987 +0000 UTC 
m=+468.536126892" watchObservedRunningTime="2025-10-01 16:10:24.580258454 +0000 UTC m=+487.579905319" Oct 01 16:10:24 crc kubenswrapper[4764]: I1001 16:10:24.622905 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m4gx9"] Oct 01 16:10:49 crc kubenswrapper[4764]: I1001 16:10:49.675514 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" podUID="a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" containerName="registry" containerID="cri-o://b5b390ab2a2764d7c618507c77aeb49f0680efb7d1f9f79542ee0a163198ca72" gracePeriod=30 Oct 01 16:10:49 crc kubenswrapper[4764]: I1001 16:10:49.821096 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" containerID="b5b390ab2a2764d7c618507c77aeb49f0680efb7d1f9f79542ee0a163198ca72" exitCode=0 Oct 01 16:10:49 crc kubenswrapper[4764]: I1001 16:10:49.821139 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" event={"ID":"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8","Type":"ContainerDied","Data":"b5b390ab2a2764d7c618507c77aeb49f0680efb7d1f9f79542ee0a163198ca72"} Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.013986 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-tls\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-installation-pull-secrets\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061727 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-ca-trust-extracted\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061858 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpj9c\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-kube-api-access-qpj9c\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061940 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-certificates\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061957 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-bound-sa-token\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.061982 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-trusted-ca\") pod \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\" (UID: \"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8\") " Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.062929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.063379 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.074469 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.074708 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.074982 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.075449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-kube-api-access-qpj9c" (OuterVolumeSpecName: "kube-api-access-qpj9c") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "kube-api-access-qpj9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.076179 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.089938 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" (UID: "a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162844 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162881 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpj9c\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-kube-api-access-qpj9c\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162898 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162909 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162920 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162931 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.162941 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.831033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" event={"ID":"a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8","Type":"ContainerDied","Data":"33a6b547337bbeac58b61e3a55156bc34839c387d6343a85b356652acd87b412"} Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.831169 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m4gx9" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.831193 4764 scope.go:117] "RemoveContainer" containerID="b5b390ab2a2764d7c618507c77aeb49f0680efb7d1f9f79542ee0a163198ca72" Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.878794 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m4gx9"] Oct 01 16:10:50 crc kubenswrapper[4764]: I1001 16:10:50.887250 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m4gx9"] Oct 01 16:10:51 crc kubenswrapper[4764]: I1001 16:10:51.729791 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" path="/var/lib/kubelet/pods/a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8/volumes" Oct 01 16:11:51 crc kubenswrapper[4764]: I1001 16:11:51.914092 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:11:51 crc kubenswrapper[4764]: I1001 16:11:51.914999 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.092812 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q78nt"] Oct 01 16:12:08 crc kubenswrapper[4764]: E1001 16:12:08.093491 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" 
containerName="registry" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.093504 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" containerName="registry" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.110399 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e84d4e-1676-4857-ad1d-76ee4bcaf7c8" containerName="registry" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.114213 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.117814 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xdtjw" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.119327 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.119481 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.120100 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xl9g4"] Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.123645 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xl9g4" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.126209 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q78nt"] Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.126833 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wrz8w" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.137847 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pxbtx"] Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.138524 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.140680 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8pr9z" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.149957 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xl9g4"] Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.154827 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pxbtx"] Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.309180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ljqx\" (UniqueName: \"kubernetes.io/projected/9eb7dd6e-2f03-4db5-9564-d87513d69d6b-kube-api-access-8ljqx\") pod \"cert-manager-webhook-5655c58dd6-pxbtx\" (UID: \"9eb7dd6e-2f03-4db5-9564-d87513d69d6b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.309253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4hd\" (UniqueName: 
\"kubernetes.io/projected/b22968b8-7419-48e2-9fab-a54611dcecad-kube-api-access-9x4hd\") pod \"cert-manager-5b446d88c5-xl9g4\" (UID: \"b22968b8-7419-48e2-9fab-a54611dcecad\") " pod="cert-manager/cert-manager-5b446d88c5-xl9g4" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.309285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmh78\" (UniqueName: \"kubernetes.io/projected/0e72f60f-a975-4370-877f-5d5ba3c7c0b3-kube-api-access-jmh78\") pod \"cert-manager-cainjector-7f985d654d-q78nt\" (UID: \"0e72f60f-a975-4370-877f-5d5ba3c7c0b3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.410276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ljqx\" (UniqueName: \"kubernetes.io/projected/9eb7dd6e-2f03-4db5-9564-d87513d69d6b-kube-api-access-8ljqx\") pod \"cert-manager-webhook-5655c58dd6-pxbtx\" (UID: \"9eb7dd6e-2f03-4db5-9564-d87513d69d6b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.410373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4hd\" (UniqueName: \"kubernetes.io/projected/b22968b8-7419-48e2-9fab-a54611dcecad-kube-api-access-9x4hd\") pod \"cert-manager-5b446d88c5-xl9g4\" (UID: \"b22968b8-7419-48e2-9fab-a54611dcecad\") " pod="cert-manager/cert-manager-5b446d88c5-xl9g4" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.410408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmh78\" (UniqueName: \"kubernetes.io/projected/0e72f60f-a975-4370-877f-5d5ba3c7c0b3-kube-api-access-jmh78\") pod \"cert-manager-cainjector-7f985d654d-q78nt\" (UID: \"0e72f60f-a975-4370-877f-5d5ba3c7c0b3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.428460 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ljqx\" (UniqueName: \"kubernetes.io/projected/9eb7dd6e-2f03-4db5-9564-d87513d69d6b-kube-api-access-8ljqx\") pod \"cert-manager-webhook-5655c58dd6-pxbtx\" (UID: \"9eb7dd6e-2f03-4db5-9564-d87513d69d6b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.429801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4hd\" (UniqueName: \"kubernetes.io/projected/b22968b8-7419-48e2-9fab-a54611dcecad-kube-api-access-9x4hd\") pod \"cert-manager-5b446d88c5-xl9g4\" (UID: \"b22968b8-7419-48e2-9fab-a54611dcecad\") " pod="cert-manager/cert-manager-5b446d88c5-xl9g4" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.438379 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmh78\" (UniqueName: \"kubernetes.io/projected/0e72f60f-a975-4370-877f-5d5ba3c7c0b3-kube-api-access-jmh78\") pod \"cert-manager-cainjector-7f985d654d-q78nt\" (UID: \"0e72f60f-a975-4370-877f-5d5ba3c7c0b3\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.449857 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xl9g4" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.458274 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.656780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xl9g4"] Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.664034 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.697922 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-pxbtx"] Oct 01 16:12:08 crc kubenswrapper[4764]: W1001 16:12:08.702039 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eb7dd6e_2f03_4db5_9564_d87513d69d6b.slice/crio-40095ba81812613819f4a511865941f7bca185448eba5f702e3e98ad54285884 WatchSource:0}: Error finding container 40095ba81812613819f4a511865941f7bca185448eba5f702e3e98ad54285884: Status 404 returned error can't find the container with id 40095ba81812613819f4a511865941f7bca185448eba5f702e3e98ad54285884 Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.738259 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" Oct 01 16:12:08 crc kubenswrapper[4764]: I1001 16:12:08.910106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-q78nt"] Oct 01 16:12:09 crc kubenswrapper[4764]: I1001 16:12:09.316060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" event={"ID":"0e72f60f-a975-4370-877f-5d5ba3c7c0b3","Type":"ContainerStarted","Data":"e9ca211b46feeae17c0a2c6a83a8bdff032ee82134e22daa51a0339dd14a9639"} Oct 01 16:12:09 crc kubenswrapper[4764]: I1001 16:12:09.317008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" event={"ID":"9eb7dd6e-2f03-4db5-9564-d87513d69d6b","Type":"ContainerStarted","Data":"40095ba81812613819f4a511865941f7bca185448eba5f702e3e98ad54285884"} Oct 01 16:12:09 crc kubenswrapper[4764]: I1001 16:12:09.318516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xl9g4" event={"ID":"b22968b8-7419-48e2-9fab-a54611dcecad","Type":"ContainerStarted","Data":"2fe9746dd2e03b11a061f3b19c972be41abb92ac8d41e1513e4fb072389c30bd"} Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.369204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" event={"ID":"9eb7dd6e-2f03-4db5-9564-d87513d69d6b","Type":"ContainerStarted","Data":"7a8403e163893f02b7bc589304801b6a078d8c85427e33fdf25bee4e21b77dad"} Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.370893 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.370979 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xl9g4" 
event={"ID":"b22968b8-7419-48e2-9fab-a54611dcecad","Type":"ContainerStarted","Data":"00bcd18def7b1cc57be13d3c7e96d2f14440c29eebf52a93b4fbac2bbfa2f537"} Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.372270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" event={"ID":"0e72f60f-a975-4370-877f-5d5ba3c7c0b3","Type":"ContainerStarted","Data":"900769d9712d4681365440a1abb03953efdd1c585985bd288d63dc1b33fe3b88"} Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.385665 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" podStartSLOduration=1.793137424 podStartE2EDuration="5.385647494s" podCreationTimestamp="2025-10-01 16:12:08 +0000 UTC" firstStartedPulling="2025-10-01 16:12:08.704740684 +0000 UTC m=+591.704387519" lastFinishedPulling="2025-10-01 16:12:12.297250754 +0000 UTC m=+595.296897589" observedRunningTime="2025-10-01 16:12:13.383620204 +0000 UTC m=+596.383267039" watchObservedRunningTime="2025-10-01 16:12:13.385647494 +0000 UTC m=+596.385294329" Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.401087 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-q78nt" podStartSLOduration=2.082122292 podStartE2EDuration="5.401042364s" podCreationTimestamp="2025-10-01 16:12:08 +0000 UTC" firstStartedPulling="2025-10-01 16:12:08.920767102 +0000 UTC m=+591.920413937" lastFinishedPulling="2025-10-01 16:12:12.239687174 +0000 UTC m=+595.239334009" observedRunningTime="2025-10-01 16:12:13.397033825 +0000 UTC m=+596.396680700" watchObservedRunningTime="2025-10-01 16:12:13.401042364 +0000 UTC m=+596.400689239" Oct 01 16:12:13 crc kubenswrapper[4764]: I1001 16:12:13.413413 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-xl9g4" podStartSLOduration=3.059325608 
podStartE2EDuration="5.413387559s" podCreationTimestamp="2025-10-01 16:12:08 +0000 UTC" firstStartedPulling="2025-10-01 16:12:08.663825874 +0000 UTC m=+591.663472709" lastFinishedPulling="2025-10-01 16:12:11.017887825 +0000 UTC m=+594.017534660" observedRunningTime="2025-10-01 16:12:13.410280481 +0000 UTC m=+596.409927336" watchObservedRunningTime="2025-10-01 16:12:13.413387559 +0000 UTC m=+596.413034414" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.462989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-pxbtx" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.563850 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fngxf"] Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565209 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-controller" containerID="cri-o://df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565300 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="northd" containerID="cri-o://98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565312 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-node" containerID="cri-o://33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565356 4764 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="sbdb" containerID="cri-o://4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565394 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-acl-logging" containerID="cri-o://5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565415 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="nbdb" containerID="cri-o://461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.565398 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.600261 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" containerID="cri-o://28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" gracePeriod=30 Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.856699 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/3.log" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 
16:12:18.858538 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovn-acl-logging/0.log" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.858989 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovn-controller/0.log" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.859364 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914209 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xxpkp"] Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914532 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914563 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914581 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="nbdb" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914597 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="nbdb" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914612 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914624 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 
16:12:18.914639 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914651 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914675 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-node" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914690 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-node" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914715 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914730 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914759 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="northd" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914778 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="northd" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914803 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-acl-logging" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914821 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-acl-logging" Oct 01 16:12:18 crc 
kubenswrapper[4764]: E1001 16:12:18.914843 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914858 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914878 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kubecfg-setup" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914891 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kubecfg-setup" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.914908 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="sbdb" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.914922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="sbdb" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915198 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915221 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-acl-logging" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915244 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-node" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915264 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="northd" Oct 01 16:12:18 crc 
kubenswrapper[4764]: I1001 16:12:18.915283 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915304 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915323 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovn-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915342 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="nbdb" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915360 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915374 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="sbdb" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915393 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.915560 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915577 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.915765 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" 
containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: E1001 16:12:18.915991 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.916009 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerName="ovnkube-controller" Oct 01 16:12:18 crc kubenswrapper[4764]: I1001 16:12:18.918828 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042517 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovn-node-metrics-cert\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-var-lib-openvswitch\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042684 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042735 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-openvswitch\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042791 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042817 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042841 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-config\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.042866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-slash\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043086 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-slash" (OuterVolumeSpecName: "host-slash") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043339 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043401 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-kubelet\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043434 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-node-log\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-script-lib\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044183 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-log-socket\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044229 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-ovn-kubernetes\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.043578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-node-log" (OuterVolumeSpecName: "node-log") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-log-socket" (OuterVolumeSpecName: "log-socket") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-netns\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044285 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044305 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-systemd-units\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-netd\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044343 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044361 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-etc-openvswitch\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044371 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-bin\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-systemd\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zbvx\" (UniqueName: \"kubernetes.io/projected/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-kube-api-access-5zbvx\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-ovn\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044616 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-env-overrides\") pod \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\" (UID: \"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8\") " Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-systemd\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-node-log\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044500 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovn-node-metrics-cert\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xzm\" (UniqueName: \"kubernetes.io/projected/44e3f40a-8050-4a64-baa2-1cd24e0b620a-kube-api-access-r6xzm\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.044981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-var-lib-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045012 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-env-overrides\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-run-netns\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-slash\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-cni-bin\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovnkube-config\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045260 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-log-socket\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-systemd-units\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-cni-netd\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045326 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-etc-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-ovn\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045368 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovnkube-script-lib\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045389 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-kubelet\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045566 4764 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045587 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045603 4764 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045619 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045633 4764 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045645 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045657 4764 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045668 4764 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045680 4764 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045692 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045704 4764 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045716 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045728 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 
crc kubenswrapper[4764]: I1001 16:12:19.045739 4764 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045750 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045761 4764 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.045771 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.048641 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-kube-api-access-5zbvx" (OuterVolumeSpecName: "kube-api-access-5zbvx") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "kube-api-access-5zbvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.048882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.055824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" (UID: "fe0fc1af-28a8-48cd-ba84-954c8e7de3e8"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-systemd\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-node-log\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovn-node-metrics-cert\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xzm\" (UniqueName: \"kubernetes.io/projected/44e3f40a-8050-4a64-baa2-1cd24e0b620a-kube-api-access-r6xzm\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-var-lib-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146572 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-env-overrides\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-run-netns\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-systemd\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146668 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-node-log\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146675 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-slash\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146629 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-slash\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-cni-bin\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146789 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovnkube-config\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-log-socket\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-systemd-units\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146844 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-var-lib-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146876 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-cni-netd\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-cni-netd\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146921 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-etc-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146936 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-cni-bin\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146943 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-ovn\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.146986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovnkube-script-lib\") 
pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-kubelet\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147087 4764 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147100 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zbvx\" (UniqueName: \"kubernetes.io/projected/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-kube-api-access-5zbvx\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147109 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-kubelet\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147261 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-log-socket\") pod \"ovnkube-node-xxpkp\" (UID: 
\"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-ovn\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-run-netns\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-systemd-units\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-run-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147396 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/44e3f40a-8050-4a64-baa2-1cd24e0b620a-etc-openvswitch\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovnkube-config\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.147955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-env-overrides\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.148216 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovnkube-script-lib\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 
16:12:19.152667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44e3f40a-8050-4a64-baa2-1cd24e0b620a-ovn-node-metrics-cert\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.168277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xzm\" (UniqueName: \"kubernetes.io/projected/44e3f40a-8050-4a64-baa2-1cd24e0b620a-kube-api-access-r6xzm\") pod \"ovnkube-node-xxpkp\" (UID: \"44e3f40a-8050-4a64-baa2-1cd24e0b620a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.235680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:19 crc kubenswrapper[4764]: W1001 16:12:19.254088 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e3f40a_8050_4a64_baa2_1cd24e0b620a.slice/crio-c6fc3ce57b8dc060b511ce3b5c35ac0d47ea417bfc9a107cf47f16d712102ebd WatchSource:0}: Error finding container c6fc3ce57b8dc060b511ce3b5c35ac0d47ea417bfc9a107cf47f16d712102ebd: Status 404 returned error can't find the container with id c6fc3ce57b8dc060b511ce3b5c35ac0d47ea417bfc9a107cf47f16d712102ebd Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.413259 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/2.log" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.413922 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/1.log" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.414036 4764 generic.go:334] "Generic (PLEG): container 
finished" podID="5499b593-79e4-408e-a32b-9e132d3a0de7" containerID="ebe948ccdf109b264c30c2e6b27c52173e08727669e0354529418595261bf85b" exitCode=2 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.414082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerDied","Data":"ebe948ccdf109b264c30c2e6b27c52173e08727669e0354529418595261bf85b"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.414274 4764 scope.go:117] "RemoveContainer" containerID="a3387128461900e8f05e7f2f66414837632b948e04d7f63dce60c77a52dcd40d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.414748 4764 scope.go:117] "RemoveContainer" containerID="ebe948ccdf109b264c30c2e6b27c52173e08727669e0354529418595261bf85b" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.414941 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ks425_openshift-multus(5499b593-79e4-408e-a32b-9e132d3a0de7)\"" pod="openshift-multus/multus-ks425" podUID="5499b593-79e4-408e-a32b-9e132d3a0de7" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.417815 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovnkube-controller/3.log" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.421863 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovn-acl-logging/0.log" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422388 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fngxf_fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/ovn-controller/0.log" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422707 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422795 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422862 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422919 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422974 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423027 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423107 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" exitCode=143 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422820 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423167 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" exitCode=143 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.422738 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423596 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423660 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423736 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423791 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423838 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423894 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.423961 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424016 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424089 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} Oct 01 
16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424138 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424187 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424235 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424366 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424422 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424472 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424521 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424570 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424626 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424678 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424729 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424776 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424825 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424931 4764 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.424983 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425077 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425132 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425181 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425230 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425285 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425341 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425479 4764 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425554 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fngxf" event={"ID":"fe0fc1af-28a8-48cd-ba84-954c8e7de3e8","Type":"ContainerDied","Data":"25941f43ecde66b19a0c769d012677267d4d81919e7d78ec24241ede26c2549b"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425705 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425759 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425816 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425872 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425919 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} Oct 01 
16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425976 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.426031 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.426108 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.426171 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.426220 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.426276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerDied","Data":"a9f32a4e00b95f0f122ac32fa25cf949db5441a9fa11d4c3518fd9d886e7a19f"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.425243 4764 generic.go:334] "Generic (PLEG): container finished" podID="44e3f40a-8050-4a64-baa2-1cd24e0b620a" containerID="a9f32a4e00b95f0f122ac32fa25cf949db5441a9fa11d4c3518fd9d886e7a19f" exitCode=0 Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.426384 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" 
event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"c6fc3ce57b8dc060b511ce3b5c35ac0d47ea417bfc9a107cf47f16d712102ebd"} Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.470143 4764 scope.go:117] "RemoveContainer" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.495923 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fngxf"] Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.498737 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fngxf"] Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.506981 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.531740 4764 scope.go:117] "RemoveContainer" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.544177 4764 scope.go:117] "RemoveContainer" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.568440 4764 scope.go:117] "RemoveContainer" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.586895 4764 scope.go:117] "RemoveContainer" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.601769 4764 scope.go:117] "RemoveContainer" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.627764 4764 scope.go:117] "RemoveContainer" containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.646700 4764 
scope.go:117] "RemoveContainer" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.666130 4764 scope.go:117] "RemoveContainer" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.683716 4764 scope.go:117] "RemoveContainer" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.684404 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": container with ID starting with 28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d not found: ID does not exist" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.684443 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} err="failed to get container status \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": rpc error: code = NotFound desc = could not find container \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": container with ID starting with 28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.684476 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.684902 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": container with ID starting with 
fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a not found: ID does not exist" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.684959 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} err="failed to get container status \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": rpc error: code = NotFound desc = could not find container \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": container with ID starting with fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.684975 4764 scope.go:117] "RemoveContainer" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.685248 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": container with ID starting with 4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6 not found: ID does not exist" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685275 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} err="failed to get container status \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": rpc error: code = NotFound desc = could not find container \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": container with ID starting with 4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6 not found: ID does not 
exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685289 4764 scope.go:117] "RemoveContainer" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.685483 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": container with ID starting with 461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f not found: ID does not exist" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685501 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} err="failed to get container status \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": rpc error: code = NotFound desc = could not find container \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": container with ID starting with 461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685516 4764 scope.go:117] "RemoveContainer" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.685705 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": container with ID starting with 98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194 not found: ID does not exist" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685723 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} err="failed to get container status \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": rpc error: code = NotFound desc = could not find container \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": container with ID starting with 98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685735 4764 scope.go:117] "RemoveContainer" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.685901 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": container with ID starting with 93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5 not found: ID does not exist" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685918 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} err="failed to get container status \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": rpc error: code = NotFound desc = could not find container \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": container with ID starting with 93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.685928 4764 scope.go:117] "RemoveContainer" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.686122 4764 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": container with ID starting with 33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199 not found: ID does not exist" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686137 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} err="failed to get container status \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": rpc error: code = NotFound desc = could not find container \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": container with ID starting with 33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686162 4764 scope.go:117] "RemoveContainer" containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.686333 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": container with ID starting with 5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d not found: ID does not exist" containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686350 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} err="failed to get container status \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": rpc error: code = NotFound desc = could 
not find container \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": container with ID starting with 5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686362 4764 scope.go:117] "RemoveContainer" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.686549 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": container with ID starting with df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a not found: ID does not exist" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686571 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} err="failed to get container status \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": rpc error: code = NotFound desc = could not find container \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": container with ID starting with df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686583 4764 scope.go:117] "RemoveContainer" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" Oct 01 16:12:19 crc kubenswrapper[4764]: E1001 16:12:19.686776 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": container with ID starting with 021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24 not found: 
ID does not exist" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686795 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} err="failed to get container status \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": rpc error: code = NotFound desc = could not find container \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": container with ID starting with 021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.686807 4764 scope.go:117] "RemoveContainer" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687000 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} err="failed to get container status \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": rpc error: code = NotFound desc = could not find container \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": container with ID starting with 28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687021 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687247 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} err="failed to get container status \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": rpc error: code = 
NotFound desc = could not find container \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": container with ID starting with fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687264 4764 scope.go:117] "RemoveContainer" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687471 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} err="failed to get container status \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": rpc error: code = NotFound desc = could not find container \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": container with ID starting with 4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687489 4764 scope.go:117] "RemoveContainer" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687820 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} err="failed to get container status \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": rpc error: code = NotFound desc = could not find container \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": container with ID starting with 461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.687838 4764 scope.go:117] "RemoveContainer" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" Oct 01 16:12:19 crc 
kubenswrapper[4764]: I1001 16:12:19.688119 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} err="failed to get container status \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": rpc error: code = NotFound desc = could not find container \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": container with ID starting with 98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.688165 4764 scope.go:117] "RemoveContainer" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.688408 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} err="failed to get container status \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": rpc error: code = NotFound desc = could not find container \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": container with ID starting with 93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.688426 4764 scope.go:117] "RemoveContainer" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.688728 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} err="failed to get container status \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": rpc error: code = NotFound desc = could not find container \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": container 
with ID starting with 33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.688762 4764 scope.go:117] "RemoveContainer" containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.688982 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} err="failed to get container status \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": rpc error: code = NotFound desc = could not find container \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": container with ID starting with 5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689000 4764 scope.go:117] "RemoveContainer" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689381 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} err="failed to get container status \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": rpc error: code = NotFound desc = could not find container \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": container with ID starting with df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689398 4764 scope.go:117] "RemoveContainer" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689592 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} err="failed to get container status \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": rpc error: code = NotFound desc = could not find container \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": container with ID starting with 021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689632 4764 scope.go:117] "RemoveContainer" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689815 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} err="failed to get container status \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": rpc error: code = NotFound desc = could not find container \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": container with ID starting with 28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.689831 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690090 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} err="failed to get container status \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": rpc error: code = NotFound desc = could not find container \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": container with ID starting with fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a not found: ID does not 
exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690109 4764 scope.go:117] "RemoveContainer" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690276 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} err="failed to get container status \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": rpc error: code = NotFound desc = could not find container \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": container with ID starting with 4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690290 4764 scope.go:117] "RemoveContainer" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690441 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} err="failed to get container status \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": rpc error: code = NotFound desc = could not find container \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": container with ID starting with 461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690455 4764 scope.go:117] "RemoveContainer" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690605 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} err="failed to get container status 
\"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": rpc error: code = NotFound desc = could not find container \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": container with ID starting with 98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690620 4764 scope.go:117] "RemoveContainer" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690768 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} err="failed to get container status \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": rpc error: code = NotFound desc = could not find container \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": container with ID starting with 93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690783 4764 scope.go:117] "RemoveContainer" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690937 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} err="failed to get container status \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": rpc error: code = NotFound desc = could not find container \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": container with ID starting with 33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.690954 4764 scope.go:117] "RemoveContainer" 
containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691205 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} err="failed to get container status \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": rpc error: code = NotFound desc = could not find container \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": container with ID starting with 5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691223 4764 scope.go:117] "RemoveContainer" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691386 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} err="failed to get container status \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": rpc error: code = NotFound desc = could not find container \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": container with ID starting with df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691404 4764 scope.go:117] "RemoveContainer" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691563 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} err="failed to get container status \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": rpc error: code = NotFound desc = could 
not find container \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": container with ID starting with 021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691579 4764 scope.go:117] "RemoveContainer" containerID="28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691732 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d"} err="failed to get container status \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": rpc error: code = NotFound desc = could not find container \"28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d\": container with ID starting with 28f652b293c80e47226799aa7e617bf3704b51b46aaa6773df9d41281e62271d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691744 4764 scope.go:117] "RemoveContainer" containerID="fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691899 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a"} err="failed to get container status \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": rpc error: code = NotFound desc = could not find container \"fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a\": container with ID starting with fcb36245891f3bf6d82e4a1b0484d6104b234218f0a724586915288584fc690a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.691912 4764 scope.go:117] "RemoveContainer" containerID="4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 
16:12:19.692119 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6"} err="failed to get container status \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": rpc error: code = NotFound desc = could not find container \"4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6\": container with ID starting with 4d2adecd1f1be46058c795cb730e80353f8a082a58f079ef7e897772b81bcda6 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692133 4764 scope.go:117] "RemoveContainer" containerID="461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692293 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f"} err="failed to get container status \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": rpc error: code = NotFound desc = could not find container \"461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f\": container with ID starting with 461e20904a84508a88374d81510ba2e44a0dfb0932b879c52988b8a8d282d44f not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692305 4764 scope.go:117] "RemoveContainer" containerID="98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692459 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194"} err="failed to get container status \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": rpc error: code = NotFound desc = could not find container \"98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194\": container with ID starting with 
98bf25b647b0780422519ad785c00ccc0240f2ca10cc41d62dcf86d2e40e4194 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692471 4764 scope.go:117] "RemoveContainer" containerID="93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692650 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5"} err="failed to get container status \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": rpc error: code = NotFound desc = could not find container \"93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5\": container with ID starting with 93b1a2d5cc8347fc96b92fc5732f1bb1ecbda5c41db4a4897432f251049185b5 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692679 4764 scope.go:117] "RemoveContainer" containerID="33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692889 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199"} err="failed to get container status \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": rpc error: code = NotFound desc = could not find container \"33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199\": container with ID starting with 33e44e8497fce8ca1e0f42f1097eca830640ccde4b09216603f62c7b0f789199 not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.692903 4764 scope.go:117] "RemoveContainer" containerID="5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.693268 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d"} err="failed to get container status \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": rpc error: code = NotFound desc = could not find container \"5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d\": container with ID starting with 5b1db68f503aafa73730c28130390ecb6e56e490b20af9226a29baa82ade943d not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.693282 4764 scope.go:117] "RemoveContainer" containerID="df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.693490 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a"} err="failed to get container status \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": rpc error: code = NotFound desc = could not find container \"df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a\": container with ID starting with df16f206c094e642f34a827c852d9f4965f1d5aabe48389e43c8f7295f03de5a not found: ID does not exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.693505 4764 scope.go:117] "RemoveContainer" containerID="021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.693702 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24"} err="failed to get container status \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": rpc error: code = NotFound desc = could not find container \"021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24\": container with ID starting with 021613f8b927734a4e00126175b44abbb6ba8662dd730af0951453de9e3abc24 not found: ID does not 
exist" Oct 01 16:12:19 crc kubenswrapper[4764]: I1001 16:12:19.732315 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0fc1af-28a8-48cd-ba84-954c8e7de3e8" path="/var/lib/kubelet/pods/fe0fc1af-28a8-48cd-ba84-954c8e7de3e8/volumes" Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.441333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"a6e100acc7e3013fd98b85c8a781e58b6fa7c85a988b62a89d5fcc24cc43df73"} Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.441657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"10b6ccb6550f10f5b31db7be87a35b25e77b7ddab538660f55475c4b502ce495"} Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.441678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"a7d792a0a87fab9083942f988e5604e90fbef6c6748582015a4c5a27df45d494"} Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.441697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"72009652f645eb1d97d86fa3653e2f1a89bf8bb7859afebfe4e3dee1cc012b50"} Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.441714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"0c6ee29cdb1ab40df3c12246519b7fa5336efb40c428a41e8ad6040ab3246199"} Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.441734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"dc7875d03579eaa911ecb0a3655267f6954d4c7e82c1456655799ba97f290302"} Oct 01 16:12:20 crc kubenswrapper[4764]: I1001 16:12:20.444398 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/2.log" Oct 01 16:12:21 crc kubenswrapper[4764]: I1001 16:12:21.914384 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:12:21 crc kubenswrapper[4764]: I1001 16:12:21.914442 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:12:23 crc kubenswrapper[4764]: I1001 16:12:23.482596 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"e662f655ca2996e01161458dcaa899c9b8bf5c381215c6f034e1cd5651c84944"} Oct 01 16:12:25 crc kubenswrapper[4764]: I1001 16:12:25.497986 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" event={"ID":"44e3f40a-8050-4a64-baa2-1cd24e0b620a","Type":"ContainerStarted","Data":"c9d5b4ba5aeae5a0acd21cdf70f04296d1957e64e83caecf8cc28c18c4e1da76"} Oct 01 16:12:25 crc kubenswrapper[4764]: I1001 16:12:25.498536 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:25 
crc kubenswrapper[4764]: I1001 16:12:25.498549 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:25 crc kubenswrapper[4764]: I1001 16:12:25.498559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:25 crc kubenswrapper[4764]: I1001 16:12:25.524315 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" podStartSLOduration=7.524295835 podStartE2EDuration="7.524295835s" podCreationTimestamp="2025-10-01 16:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:12:25.522757467 +0000 UTC m=+608.522404322" watchObservedRunningTime="2025-10-01 16:12:25.524295835 +0000 UTC m=+608.523942670" Oct 01 16:12:25 crc kubenswrapper[4764]: I1001 16:12:25.528330 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:25 crc kubenswrapper[4764]: I1001 16:12:25.535311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:34 crc kubenswrapper[4764]: I1001 16:12:34.722462 4764 scope.go:117] "RemoveContainer" containerID="ebe948ccdf109b264c30c2e6b27c52173e08727669e0354529418595261bf85b" Oct 01 16:12:34 crc kubenswrapper[4764]: E1001 16:12:34.723554 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ks425_openshift-multus(5499b593-79e4-408e-a32b-9e132d3a0de7)\"" pod="openshift-multus/multus-ks425" podUID="5499b593-79e4-408e-a32b-9e132d3a0de7" Oct 01 16:12:46 crc kubenswrapper[4764]: I1001 16:12:46.722468 4764 scope.go:117] "RemoveContainer" 
containerID="ebe948ccdf109b264c30c2e6b27c52173e08727669e0354529418595261bf85b" Oct 01 16:12:47 crc kubenswrapper[4764]: I1001 16:12:47.637887 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ks425_5499b593-79e4-408e-a32b-9e132d3a0de7/kube-multus/2.log" Oct 01 16:12:47 crc kubenswrapper[4764]: I1001 16:12:47.638242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ks425" event={"ID":"5499b593-79e4-408e-a32b-9e132d3a0de7","Type":"ContainerStarted","Data":"25a06d5432b2e5bd120bdf24a85a0d5470e10e0f979f9750c9b5aad092e7225c"} Oct 01 16:12:49 crc kubenswrapper[4764]: I1001 16:12:49.267069 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xxpkp" Oct 01 16:12:51 crc kubenswrapper[4764]: I1001 16:12:51.913880 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:12:51 crc kubenswrapper[4764]: I1001 16:12:51.914497 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:12:51 crc kubenswrapper[4764]: I1001 16:12:51.914542 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:12:51 crc kubenswrapper[4764]: I1001 16:12:51.915555 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c9d37c73ed33c3edc83cd30171905ddb550eb174fab70b91e5c87cb08088ccc7"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:12:51 crc kubenswrapper[4764]: I1001 16:12:51.915625 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://c9d37c73ed33c3edc83cd30171905ddb550eb174fab70b91e5c87cb08088ccc7" gracePeriod=600 Oct 01 16:12:52 crc kubenswrapper[4764]: I1001 16:12:52.670348 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="c9d37c73ed33c3edc83cd30171905ddb550eb174fab70b91e5c87cb08088ccc7" exitCode=0 Oct 01 16:12:52 crc kubenswrapper[4764]: I1001 16:12:52.670428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"c9d37c73ed33c3edc83cd30171905ddb550eb174fab70b91e5c87cb08088ccc7"} Oct 01 16:12:52 crc kubenswrapper[4764]: I1001 16:12:52.670752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"36994ceb1acaf44344047ef2a5795d007fa57999fff00a6c7967859219769b54"} Oct 01 16:12:52 crc kubenswrapper[4764]: I1001 16:12:52.670780 4764 scope.go:117] "RemoveContainer" containerID="50ed732785efefc18292368412dcb52035a50e6aac0d6b7e5cfa2693eb204317" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.746697 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw"] Oct 01 16:12:57 crc 
kubenswrapper[4764]: I1001 16:12:57.748454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.750980 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.763951 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw"] Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.874138 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9nnr\" (UniqueName: \"kubernetes.io/projected/da3eb657-76ae-4f69-9c49-51a7dfa7f054-kube-api-access-t9nnr\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.875272 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.875327 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.976276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.976346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.976445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9nnr\" (UniqueName: \"kubernetes.io/projected/da3eb657-76ae-4f69-9c49-51a7dfa7f054-kube-api-access-t9nnr\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.976917 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:57 crc kubenswrapper[4764]: I1001 16:12:57.977177 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:58 crc kubenswrapper[4764]: I1001 16:12:58.003146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9nnr\" (UniqueName: \"kubernetes.io/projected/da3eb657-76ae-4f69-9c49-51a7dfa7f054-kube-api-access-t9nnr\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:58 crc kubenswrapper[4764]: I1001 16:12:58.066142 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:12:58 crc kubenswrapper[4764]: I1001 16:12:58.302263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw"] Oct 01 16:12:58 crc kubenswrapper[4764]: I1001 16:12:58.713090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" event={"ID":"da3eb657-76ae-4f69-9c49-51a7dfa7f054","Type":"ContainerStarted","Data":"2e8f92187419be725e6ba3ce78041466633019541d36804bfaf518f1f02a8013"} Oct 01 16:12:58 crc kubenswrapper[4764]: I1001 16:12:58.713534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" 
event={"ID":"da3eb657-76ae-4f69-9c49-51a7dfa7f054","Type":"ContainerStarted","Data":"3150204d89ce0295b4b5414823aa918f0cf2e2d4fb52a4172dfdf0ff1990c683"} Oct 01 16:12:59 crc kubenswrapper[4764]: I1001 16:12:59.724425 4764 generic.go:334] "Generic (PLEG): container finished" podID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerID="2e8f92187419be725e6ba3ce78041466633019541d36804bfaf518f1f02a8013" exitCode=0 Oct 01 16:12:59 crc kubenswrapper[4764]: I1001 16:12:59.742943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" event={"ID":"da3eb657-76ae-4f69-9c49-51a7dfa7f054","Type":"ContainerDied","Data":"2e8f92187419be725e6ba3ce78041466633019541d36804bfaf518f1f02a8013"} Oct 01 16:13:01 crc kubenswrapper[4764]: I1001 16:13:01.739990 4764 generic.go:334] "Generic (PLEG): container finished" podID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerID="195bf2e8cf888060d8779f82e6210c48eeaa916fd41deb94e4a369db36c39e7c" exitCode=0 Oct 01 16:13:01 crc kubenswrapper[4764]: I1001 16:13:01.740102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" event={"ID":"da3eb657-76ae-4f69-9c49-51a7dfa7f054","Type":"ContainerDied","Data":"195bf2e8cf888060d8779f82e6210c48eeaa916fd41deb94e4a369db36c39e7c"} Oct 01 16:13:02 crc kubenswrapper[4764]: I1001 16:13:02.753299 4764 generic.go:334] "Generic (PLEG): container finished" podID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerID="75191c4be40920a1e40dc208b7feb5c6e519b66a2375ce7b65a67374715b1327" exitCode=0 Oct 01 16:13:02 crc kubenswrapper[4764]: I1001 16:13:02.753354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" event={"ID":"da3eb657-76ae-4f69-9c49-51a7dfa7f054","Type":"ContainerDied","Data":"75191c4be40920a1e40dc208b7feb5c6e519b66a2375ce7b65a67374715b1327"} 
Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.043221 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.166373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-util\") pod \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.166487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9nnr\" (UniqueName: \"kubernetes.io/projected/da3eb657-76ae-4f69-9c49-51a7dfa7f054-kube-api-access-t9nnr\") pod \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.167239 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-bundle\") pod \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\" (UID: \"da3eb657-76ae-4f69-9c49-51a7dfa7f054\") " Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.168324 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-bundle" (OuterVolumeSpecName: "bundle") pod "da3eb657-76ae-4f69-9c49-51a7dfa7f054" (UID: "da3eb657-76ae-4f69-9c49-51a7dfa7f054"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.172407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3eb657-76ae-4f69-9c49-51a7dfa7f054-kube-api-access-t9nnr" (OuterVolumeSpecName: "kube-api-access-t9nnr") pod "da3eb657-76ae-4f69-9c49-51a7dfa7f054" (UID: "da3eb657-76ae-4f69-9c49-51a7dfa7f054"). InnerVolumeSpecName "kube-api-access-t9nnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.187314 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-util" (OuterVolumeSpecName: "util") pod "da3eb657-76ae-4f69-9c49-51a7dfa7f054" (UID: "da3eb657-76ae-4f69-9c49-51a7dfa7f054"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.269507 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.269557 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/da3eb657-76ae-4f69-9c49-51a7dfa7f054-util\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.269577 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9nnr\" (UniqueName: \"kubernetes.io/projected/da3eb657-76ae-4f69-9c49-51a7dfa7f054-kube-api-access-t9nnr\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.769369 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" 
event={"ID":"da3eb657-76ae-4f69-9c49-51a7dfa7f054","Type":"ContainerDied","Data":"3150204d89ce0295b4b5414823aa918f0cf2e2d4fb52a4172dfdf0ff1990c683"} Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.769406 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3150204d89ce0295b4b5414823aa918f0cf2e2d4fb52a4172dfdf0ff1990c683" Oct 01 16:13:04 crc kubenswrapper[4764]: I1001 16:13:04.769449 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.218121 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d"] Oct 01 16:13:06 crc kubenswrapper[4764]: E1001 16:13:06.218691 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="util" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.218706 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="util" Oct 01 16:13:06 crc kubenswrapper[4764]: E1001 16:13:06.218719 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="pull" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.218726 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="pull" Oct 01 16:13:06 crc kubenswrapper[4764]: E1001 16:13:06.218742 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="extract" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.218749 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="extract" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.218886 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="da3eb657-76ae-4f69-9c49-51a7dfa7f054" containerName="extract" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.219344 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.221256 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-m5s9w" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.221826 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.234176 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d"] Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.235238 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.299336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xszmw\" (UniqueName: \"kubernetes.io/projected/ca151bec-53be-465f-a65d-7cd62254e50f-kube-api-access-xszmw\") pod \"nmstate-operator-5d6f6cfd66-t979d\" (UID: \"ca151bec-53be-465f-a65d-7cd62254e50f\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.400506 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xszmw\" (UniqueName: \"kubernetes.io/projected/ca151bec-53be-465f-a65d-7cd62254e50f-kube-api-access-xszmw\") pod \"nmstate-operator-5d6f6cfd66-t979d\" (UID: \"ca151bec-53be-465f-a65d-7cd62254e50f\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.417329 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xszmw\" (UniqueName: \"kubernetes.io/projected/ca151bec-53be-465f-a65d-7cd62254e50f-kube-api-access-xszmw\") pod \"nmstate-operator-5d6f6cfd66-t979d\" (UID: \"ca151bec-53be-465f-a65d-7cd62254e50f\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.535092 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.723683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d"] Oct 01 16:13:06 crc kubenswrapper[4764]: W1001 16:13:06.740285 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca151bec_53be_465f_a65d_7cd62254e50f.slice/crio-56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740 WatchSource:0}: Error finding container 56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740: Status 404 returned error can't find the container with id 56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740 Oct 01 16:13:06 crc kubenswrapper[4764]: I1001 16:13:06.782186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" event={"ID":"ca151bec-53be-465f-a65d-7cd62254e50f","Type":"ContainerStarted","Data":"56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740"} Oct 01 16:13:09 crc kubenswrapper[4764]: I1001 16:13:09.803807 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" event={"ID":"ca151bec-53be-465f-a65d-7cd62254e50f","Type":"ContainerStarted","Data":"9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51"} Oct 01 16:13:09 crc kubenswrapper[4764]: I1001 16:13:09.819438 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" podStartSLOduration=1.561305835 podStartE2EDuration="3.819417357s" podCreationTimestamp="2025-10-01 16:13:06 +0000 UTC" firstStartedPulling="2025-10-01 16:13:06.741814395 +0000 UTC m=+649.741461220" lastFinishedPulling="2025-10-01 16:13:08.999925907 +0000 UTC m=+651.999572742" observedRunningTime="2025-10-01 16:13:09.818563095 +0000 UTC m=+652.818209960" watchObservedRunningTime="2025-10-01 16:13:09.819417357 +0000 UTC m=+652.819064222" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.800205 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.801689 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.803768 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zqfx6" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.811146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.821637 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.822262 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.828827 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.833077 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vkk2s"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.833741 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866827 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-nmstate-lock\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2d8g\" (UniqueName: \"kubernetes.io/projected/8e5be6aa-6ed0-460a-81a8-1681016c3542-kube-api-access-z2d8g\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866871 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-qln6x\" (UniqueName: \"kubernetes.io/projected/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-kube-api-access-qln6x\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-ovs-socket\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dnjf\" (UniqueName: \"kubernetes.io/projected/77091922-9776-42fc-af29-381f72eb28c6-kube-api-access-2dnjf\") pod \"nmstate-metrics-58fcddf996-ljlwj\" (UID: \"77091922-9776-42fc-af29-381f72eb28c6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.866962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-dbus-socket\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.881398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.932037 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.932951 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.936214 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-zxbg8" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.936372 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.936514 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.945662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n"] Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dnjf\" (UniqueName: \"kubernetes.io/projected/77091922-9776-42fc-af29-381f72eb28c6-kube-api-access-2dnjf\") pod \"nmstate-metrics-58fcddf996-ljlwj\" (UID: \"77091922-9776-42fc-af29-381f72eb28c6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984363 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-dbus-socket\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptp5c\" (UniqueName: \"kubernetes.io/projected/d4957cda-9179-4505-ac94-adaf24c337e8-kube-api-access-ptp5c\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " 
pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4957cda-9179-4505-ac94-adaf24c337e8-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984447 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-nmstate-lock\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2d8g\" (UniqueName: \"kubernetes.io/projected/8e5be6aa-6ed0-460a-81a8-1681016c3542-kube-api-access-z2d8g\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qln6x\" (UniqueName: \"kubernetes.io/projected/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-kube-api-access-qln6x\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: 
\"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-ovs-socket\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.984661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-ovs-socket\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.985185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-dbus-socket\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:10 crc kubenswrapper[4764]: E1001 16:13:10.985264 4764 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 01 16:13:10 crc kubenswrapper[4764]: E1001 16:13:10.985305 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair podName:2ef4a5fe-56b8-4b85-b1e6-831f98dd880b nodeName:}" failed. No retries permitted until 2025-10-01 16:13:11.485290914 +0000 UTC m=+654.484937749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair") pod "nmstate-webhook-6d689559c5-8r7dj" (UID: "2ef4a5fe-56b8-4b85-b1e6-831f98dd880b") : secret "openshift-nmstate-webhook" not found Oct 01 16:13:10 crc kubenswrapper[4764]: I1001 16:13:10.985439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-nmstate-lock\") pod \"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.006230 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dnjf\" (UniqueName: \"kubernetes.io/projected/77091922-9776-42fc-af29-381f72eb28c6-kube-api-access-2dnjf\") pod \"nmstate-metrics-58fcddf996-ljlwj\" (UID: \"77091922-9776-42fc-af29-381f72eb28c6\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.019025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qln6x\" (UniqueName: \"kubernetes.io/projected/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-kube-api-access-qln6x\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.023593 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2d8g\" (UniqueName: \"kubernetes.io/projected/8e5be6aa-6ed0-460a-81a8-1681016c3542-kube-api-access-z2d8g\") pod 
\"nmstate-handler-vkk2s\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.085669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.085756 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptp5c\" (UniqueName: \"kubernetes.io/projected/d4957cda-9179-4505-ac94-adaf24c337e8-kube-api-access-ptp5c\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.085785 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4957cda-9179-4505-ac94-adaf24c337e8-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: E1001 16:13:11.085907 4764 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 01 16:13:11 crc kubenswrapper[4764]: E1001 16:13:11.085999 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert podName:d4957cda-9179-4505-ac94-adaf24c337e8 nodeName:}" failed. No retries permitted until 2025-10-01 16:13:11.58597616 +0000 UTC m=+654.585623075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-f948n" (UID: "d4957cda-9179-4505-ac94-adaf24c337e8") : secret "plugin-serving-cert" not found Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.086593 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4957cda-9179-4505-ac94-adaf24c337e8-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.101298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptp5c\" (UniqueName: \"kubernetes.io/projected/d4957cda-9179-4505-ac94-adaf24c337e8-kube-api-access-ptp5c\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.109560 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d9bf7d847-bj9bj"] Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.110374 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.119316 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9bf7d847-bj9bj"] Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.123299 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.167480 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-trusted-ca-bundle\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-oauth-config\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186827 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgb5z\" (UniqueName: \"kubernetes.io/projected/2267c6d3-7680-4332-bf2f-b4efebe6b76f-kube-api-access-fgb5z\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-serving-cert\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-oauth-serving-cert\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-service-ca\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.186927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-config\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.288597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-oauth-config\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.288954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgb5z\" (UniqueName: \"kubernetes.io/projected/2267c6d3-7680-4332-bf2f-b4efebe6b76f-kube-api-access-fgb5z\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.289000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-serving-cert\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.289024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-oauth-serving-cert\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.289060 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-service-ca\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.289086 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-config\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.289125 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-trusted-ca-bundle\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.290140 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-service-ca\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.290632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-trusted-ca-bundle\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.290654 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-oauth-serving-cert\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.290775 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-config\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.292719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-serving-cert\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.300392 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2267c6d3-7680-4332-bf2f-b4efebe6b76f-console-oauth-config\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.300513 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj"] Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.308475 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgb5z\" (UniqueName: \"kubernetes.io/projected/2267c6d3-7680-4332-bf2f-b4efebe6b76f-kube-api-access-fgb5z\") pod \"console-5d9bf7d847-bj9bj\" (UID: \"2267c6d3-7680-4332-bf2f-b4efebe6b76f\") " pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.426873 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.490989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.495390 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-8r7dj\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.591777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.595189 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-f948n\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.631311 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9bf7d847-bj9bj"] Oct 01 16:13:11 crc kubenswrapper[4764]: W1001 16:13:11.632987 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2267c6d3_7680_4332_bf2f_b4efebe6b76f.slice/crio-92af775cc6c4ef6f86686eab194fe25f46c42d0981ae174cb915c744753fe53b WatchSource:0}: Error finding container 92af775cc6c4ef6f86686eab194fe25f46c42d0981ae174cb915c744753fe53b: Status 404 returned error can't find the container with id 92af775cc6c4ef6f86686eab194fe25f46c42d0981ae174cb915c744753fe53b Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.651709 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.742568 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.824274 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n"] Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.824646 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9bf7d847-bj9bj" event={"ID":"2267c6d3-7680-4332-bf2f-b4efebe6b76f","Type":"ContainerStarted","Data":"d9fc818855e4d7b772ffa79e680a78fa36f775b18790ae2f7b7977c50e613867"} Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.824684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9bf7d847-bj9bj" event={"ID":"2267c6d3-7680-4332-bf2f-b4efebe6b76f","Type":"ContainerStarted","Data":"92af775cc6c4ef6f86686eab194fe25f46c42d0981ae174cb915c744753fe53b"} Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.847755 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" event={"ID":"77091922-9776-42fc-af29-381f72eb28c6","Type":"ContainerStarted","Data":"b2bbbf432302ec59fc84b5aa0ff9da03fd9243aebcd12cc851aeaac0a801d081"} Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.850489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vkk2s" event={"ID":"8e5be6aa-6ed0-460a-81a8-1681016c3542","Type":"ContainerStarted","Data":"7f8a146c2ea6127875161e5ff6e9621fba9102d2fa95b827db62ba1a1ec5e2ef"} Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.855091 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d9bf7d847-bj9bj" podStartSLOduration=0.85507333 podStartE2EDuration="855.07333ms" podCreationTimestamp="2025-10-01 16:13:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
16:13:11.854436154 +0000 UTC m=+654.854083019" watchObservedRunningTime="2025-10-01 16:13:11.85507333 +0000 UTC m=+654.854720165" Oct 01 16:13:11 crc kubenswrapper[4764]: I1001 16:13:11.925526 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj"] Oct 01 16:13:12 crc kubenswrapper[4764]: I1001 16:13:12.859717 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" event={"ID":"d4957cda-9179-4505-ac94-adaf24c337e8","Type":"ContainerStarted","Data":"945a117d3e967cb799cf7f2815cb42a512407ab0f236bf6c89d83bf3e0bdc511"} Oct 01 16:13:12 crc kubenswrapper[4764]: I1001 16:13:12.860996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" event={"ID":"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b","Type":"ContainerStarted","Data":"fd5c87ffe45d8d9fc9a2a8cbfcc7ef9fbe947ad81ec5fbad759ddc6e2356d8b5"} Oct 01 16:13:14 crc kubenswrapper[4764]: I1001 16:13:14.875395 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vkk2s" event={"ID":"8e5be6aa-6ed0-460a-81a8-1681016c3542","Type":"ContainerStarted","Data":"aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244"} Oct 01 16:13:14 crc kubenswrapper[4764]: I1001 16:13:14.876077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:14 crc kubenswrapper[4764]: I1001 16:13:14.877784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" event={"ID":"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b","Type":"ContainerStarted","Data":"8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d"} Oct 01 16:13:14 crc kubenswrapper[4764]: I1001 16:13:14.877824 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:14 crc 
kubenswrapper[4764]: I1001 16:13:14.879506 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" event={"ID":"77091922-9776-42fc-af29-381f72eb28c6","Type":"ContainerStarted","Data":"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3"} Oct 01 16:13:14 crc kubenswrapper[4764]: I1001 16:13:14.894820 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vkk2s" podStartSLOduration=2.015857786 podStartE2EDuration="4.894804519s" podCreationTimestamp="2025-10-01 16:13:10 +0000 UTC" firstStartedPulling="2025-10-01 16:13:11.184903693 +0000 UTC m=+654.184550528" lastFinishedPulling="2025-10-01 16:13:14.063850406 +0000 UTC m=+657.063497261" observedRunningTime="2025-10-01 16:13:14.892491302 +0000 UTC m=+657.892138137" watchObservedRunningTime="2025-10-01 16:13:14.894804519 +0000 UTC m=+657.894451354" Oct 01 16:13:15 crc kubenswrapper[4764]: I1001 16:13:15.892911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" event={"ID":"d4957cda-9179-4505-ac94-adaf24c337e8","Type":"ContainerStarted","Data":"ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d"} Oct 01 16:13:15 crc kubenswrapper[4764]: I1001 16:13:15.907167 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" podStartSLOduration=2.86214405 podStartE2EDuration="5.90715238s" podCreationTimestamp="2025-10-01 16:13:10 +0000 UTC" firstStartedPulling="2025-10-01 16:13:11.854101866 +0000 UTC m=+654.853748701" lastFinishedPulling="2025-10-01 16:13:14.899110186 +0000 UTC m=+657.898757031" observedRunningTime="2025-10-01 16:13:15.906950584 +0000 UTC m=+658.906597419" watchObservedRunningTime="2025-10-01 16:13:15.90715238 +0000 UTC m=+658.906799215" Oct 01 16:13:15 crc kubenswrapper[4764]: I1001 16:13:15.913467 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" podStartSLOduration=3.7881065080000003 podStartE2EDuration="5.913448776s" podCreationTimestamp="2025-10-01 16:13:10 +0000 UTC" firstStartedPulling="2025-10-01 16:13:11.937803381 +0000 UTC m=+654.937450216" lastFinishedPulling="2025-10-01 16:13:14.063145649 +0000 UTC m=+657.062792484" observedRunningTime="2025-10-01 16:13:14.910331044 +0000 UTC m=+657.909977879" watchObservedRunningTime="2025-10-01 16:13:15.913448776 +0000 UTC m=+658.913095631" Oct 01 16:13:16 crc kubenswrapper[4764]: I1001 16:13:16.901119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" event={"ID":"77091922-9776-42fc-af29-381f72eb28c6","Type":"ContainerStarted","Data":"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178"} Oct 01 16:13:16 crc kubenswrapper[4764]: I1001 16:13:16.926296 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" podStartSLOduration=1.508306362 podStartE2EDuration="6.926272538s" podCreationTimestamp="2025-10-01 16:13:10 +0000 UTC" firstStartedPulling="2025-10-01 16:13:11.306288233 +0000 UTC m=+654.305935068" lastFinishedPulling="2025-10-01 16:13:16.724254399 +0000 UTC m=+659.723901244" observedRunningTime="2025-10-01 16:13:16.92191913 +0000 UTC m=+659.921566015" watchObservedRunningTime="2025-10-01 16:13:16.926272538 +0000 UTC m=+659.925919383" Oct 01 16:13:21 crc kubenswrapper[4764]: I1001 16:13:21.197859 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:13:21 crc kubenswrapper[4764]: I1001 16:13:21.428005 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:21 crc kubenswrapper[4764]: I1001 16:13:21.428422 4764 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:21 crc kubenswrapper[4764]: I1001 16:13:21.436778 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:21 crc kubenswrapper[4764]: I1001 16:13:21.938099 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d9bf7d847-bj9bj" Oct 01 16:13:21 crc kubenswrapper[4764]: I1001 16:13:21.983725 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pfzm8"] Oct 01 16:13:31 crc kubenswrapper[4764]: I1001 16:13:31.752602 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.030246 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pfzm8" podUID="35ad23c6-6d86-4e4f-b642-336f47fe999c" containerName="console" containerID="cri-o://b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a" gracePeriod=15 Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.305762 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9"] Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.307109 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.310301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.317882 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9"] Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.395380 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pfzm8_35ad23c6-6d86-4e4f-b642-336f47fe999c/console/0.log" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.395455 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.436001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.436073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.436141 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmpk\" (UniqueName: \"kubernetes.io/projected/6db57107-7b8f-48e1-8887-32516632caf8-kube-api-access-xxmpk\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.537488 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-oauth-config\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.537667 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-oauth-serving-cert\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.537723 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-serving-cert\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.537805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccbwc\" (UniqueName: \"kubernetes.io/projected/35ad23c6-6d86-4e4f-b642-336f47fe999c-kube-api-access-ccbwc\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.537882 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-config\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.537941 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-trusted-ca-bundle\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.538006 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-service-ca\") pod \"35ad23c6-6d86-4e4f-b642-336f47fe999c\" (UID: \"35ad23c6-6d86-4e4f-b642-336f47fe999c\") " Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.538361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmpk\" (UniqueName: \"kubernetes.io/projected/6db57107-7b8f-48e1-8887-32516632caf8-kube-api-access-xxmpk\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539260 4764 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539370 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539418 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-service-ca" (OuterVolumeSpecName: "service-ca") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539247 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-config" (OuterVolumeSpecName: "console-config") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539729 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.539760 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.540452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.540495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 
16:13:47.544110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.544409 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.545466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35ad23c6-6d86-4e4f-b642-336f47fe999c-kube-api-access-ccbwc" (OuterVolumeSpecName: "kube-api-access-ccbwc") pod "35ad23c6-6d86-4e4f-b642-336f47fe999c" (UID: "35ad23c6-6d86-4e4f-b642-336f47fe999c"). InnerVolumeSpecName "kube-api-access-ccbwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.570299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmpk\" (UniqueName: \"kubernetes.io/projected/6db57107-7b8f-48e1-8887-32516632caf8-kube-api-access-xxmpk\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.631393 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.641467 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.641512 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.641529 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35ad23c6-6d86-4e4f-b642-336f47fe999c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.641548 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35ad23c6-6d86-4e4f-b642-336f47fe999c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.641563 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccbwc\" (UniqueName: \"kubernetes.io/projected/35ad23c6-6d86-4e4f-b642-336f47fe999c-kube-api-access-ccbwc\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:47 crc kubenswrapper[4764]: I1001 16:13:47.856641 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9"] Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.142734 4764 generic.go:334] "Generic (PLEG): container finished" podID="6db57107-7b8f-48e1-8887-32516632caf8" containerID="b352323a7c5dbef7344c968d1e437b07efa3532f690500ac00b0fb849735ad95" exitCode=0 Oct 01 16:13:48 crc kubenswrapper[4764]: 
I1001 16:13:48.142841 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" event={"ID":"6db57107-7b8f-48e1-8887-32516632caf8","Type":"ContainerDied","Data":"b352323a7c5dbef7344c968d1e437b07efa3532f690500ac00b0fb849735ad95"} Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.142889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" event={"ID":"6db57107-7b8f-48e1-8887-32516632caf8","Type":"ContainerStarted","Data":"ad110aa3e423b1c585605e60c0350052fba103a1730922ede6785132c8d6e88c"} Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.145326 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pfzm8_35ad23c6-6d86-4e4f-b642-336f47fe999c/console/0.log" Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.145383 4764 generic.go:334] "Generic (PLEG): container finished" podID="35ad23c6-6d86-4e4f-b642-336f47fe999c" containerID="b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a" exitCode=2 Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.145414 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pfzm8" event={"ID":"35ad23c6-6d86-4e4f-b642-336f47fe999c","Type":"ContainerDied","Data":"b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a"} Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.145455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pfzm8" event={"ID":"35ad23c6-6d86-4e4f-b642-336f47fe999c","Type":"ContainerDied","Data":"cbe3f3eb334fefead5e7a40cb6e06c52cd12a66c3e36343b0362fe84a3deac4d"} Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.145484 4764 scope.go:117] "RemoveContainer" containerID="b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a" Oct 01 16:13:48 crc 
kubenswrapper[4764]: I1001 16:13:48.145508 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pfzm8" Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.220788 4764 scope.go:117] "RemoveContainer" containerID="b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a" Oct 01 16:13:48 crc kubenswrapper[4764]: E1001 16:13:48.221630 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a\": container with ID starting with b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a not found: ID does not exist" containerID="b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a" Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.221708 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a"} err="failed to get container status \"b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a\": rpc error: code = NotFound desc = could not find container \"b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a\": container with ID starting with b2e6c9d7bca102439f0a3bd44df64ea7e08c70902616d3befcbd823b7a3ad55a not found: ID does not exist" Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.231192 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pfzm8"] Oct 01 16:13:48 crc kubenswrapper[4764]: I1001 16:13:48.243130 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pfzm8"] Oct 01 16:13:49 crc kubenswrapper[4764]: I1001 16:13:49.736501 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35ad23c6-6d86-4e4f-b642-336f47fe999c" path="/var/lib/kubelet/pods/35ad23c6-6d86-4e4f-b642-336f47fe999c/volumes" Oct 
01 16:13:50 crc kubenswrapper[4764]: I1001 16:13:50.167407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" event={"ID":"6db57107-7b8f-48e1-8887-32516632caf8","Type":"ContainerStarted","Data":"2470902e692989de392f45a52aaf349e584901ad57c796b3743401f90bf008c2"} Oct 01 16:13:51 crc kubenswrapper[4764]: I1001 16:13:51.176382 4764 generic.go:334] "Generic (PLEG): container finished" podID="6db57107-7b8f-48e1-8887-32516632caf8" containerID="2470902e692989de392f45a52aaf349e584901ad57c796b3743401f90bf008c2" exitCode=0 Oct 01 16:13:51 crc kubenswrapper[4764]: I1001 16:13:51.176517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" event={"ID":"6db57107-7b8f-48e1-8887-32516632caf8","Type":"ContainerDied","Data":"2470902e692989de392f45a52aaf349e584901ad57c796b3743401f90bf008c2"} Oct 01 16:13:52 crc kubenswrapper[4764]: I1001 16:13:52.187923 4764 generic.go:334] "Generic (PLEG): container finished" podID="6db57107-7b8f-48e1-8887-32516632caf8" containerID="f44d20bab515d55c78512bbebc58f731254f2192609b725d996ff9f8d6206bac" exitCode=0 Oct 01 16:13:52 crc kubenswrapper[4764]: I1001 16:13:52.187980 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" event={"ID":"6db57107-7b8f-48e1-8887-32516632caf8","Type":"ContainerDied","Data":"f44d20bab515d55c78512bbebc58f731254f2192609b725d996ff9f8d6206bac"} Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.413023 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.533141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmpk\" (UniqueName: \"kubernetes.io/projected/6db57107-7b8f-48e1-8887-32516632caf8-kube-api-access-xxmpk\") pod \"6db57107-7b8f-48e1-8887-32516632caf8\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.533323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-bundle\") pod \"6db57107-7b8f-48e1-8887-32516632caf8\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.533419 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-util\") pod \"6db57107-7b8f-48e1-8887-32516632caf8\" (UID: \"6db57107-7b8f-48e1-8887-32516632caf8\") " Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.535446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-bundle" (OuterVolumeSpecName: "bundle") pod "6db57107-7b8f-48e1-8887-32516632caf8" (UID: "6db57107-7b8f-48e1-8887-32516632caf8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.541126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db57107-7b8f-48e1-8887-32516632caf8-kube-api-access-xxmpk" (OuterVolumeSpecName: "kube-api-access-xxmpk") pod "6db57107-7b8f-48e1-8887-32516632caf8" (UID: "6db57107-7b8f-48e1-8887-32516632caf8"). InnerVolumeSpecName "kube-api-access-xxmpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.608832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-util" (OuterVolumeSpecName: "util") pod "6db57107-7b8f-48e1-8887-32516632caf8" (UID: "6db57107-7b8f-48e1-8887-32516632caf8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.635736 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-util\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.635791 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmpk\" (UniqueName: \"kubernetes.io/projected/6db57107-7b8f-48e1-8887-32516632caf8-kube-api-access-xxmpk\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:53 crc kubenswrapper[4764]: I1001 16:13:53.635814 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6db57107-7b8f-48e1-8887-32516632caf8-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:13:54 crc kubenswrapper[4764]: I1001 16:13:54.205103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" event={"ID":"6db57107-7b8f-48e1-8887-32516632caf8","Type":"ContainerDied","Data":"ad110aa3e423b1c585605e60c0350052fba103a1730922ede6785132c8d6e88c"} Oct 01 16:13:54 crc kubenswrapper[4764]: I1001 16:13:54.205654 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad110aa3e423b1c585605e60c0350052fba103a1730922ede6785132c8d6e88c" Oct 01 16:13:54 crc kubenswrapper[4764]: I1001 16:13:54.205250 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.193810 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"] Oct 01 16:14:02 crc kubenswrapper[4764]: E1001 16:14:02.194633 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35ad23c6-6d86-4e4f-b642-336f47fe999c" containerName="console" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.194648 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="35ad23c6-6d86-4e4f-b642-336f47fe999c" containerName="console" Oct 01 16:14:02 crc kubenswrapper[4764]: E1001 16:14:02.194669 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db57107-7b8f-48e1-8887-32516632caf8" containerName="util" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.194677 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db57107-7b8f-48e1-8887-32516632caf8" containerName="util" Oct 01 16:14:02 crc kubenswrapper[4764]: E1001 16:14:02.194695 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db57107-7b8f-48e1-8887-32516632caf8" containerName="pull" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.194705 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db57107-7b8f-48e1-8887-32516632caf8" containerName="pull" Oct 01 16:14:02 crc kubenswrapper[4764]: E1001 16:14:02.194714 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db57107-7b8f-48e1-8887-32516632caf8" containerName="extract" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.194721 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db57107-7b8f-48e1-8887-32516632caf8" containerName="extract" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.194841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db57107-7b8f-48e1-8887-32516632caf8" 
containerName="extract" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.194854 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="35ad23c6-6d86-4e4f-b642-336f47fe999c" containerName="console" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.195324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.197932 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.198585 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.198785 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2fgfl" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.200578 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.206121 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"] Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.210346 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.347221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-apiservice-cert\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 
01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.347288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrsvq\" (UniqueName: \"kubernetes.io/projected/a75ea530-e233-41d3-a607-07ad2b30c4f1-kube-api-access-zrsvq\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.347311 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-webhook-cert\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.447438 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"] Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.448054 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrsvq\" (UniqueName: \"kubernetes.io/projected/a75ea530-e233-41d3-a607-07ad2b30c4f1-kube-api-access-zrsvq\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.448104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-webhook-cert\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " 
pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.448169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-apiservice-cert\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.448296 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.453638 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.453769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-webhook-cert\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.453886 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.454330 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-66cx9" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.460120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-apiservice-cert\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" 
(UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.467605 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"] Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.490703 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrsvq\" (UniqueName: \"kubernetes.io/projected/a75ea530-e233-41d3-a607-07ad2b30c4f1-kube-api-access-zrsvq\") pod \"metallb-operator-controller-manager-6694fb7ccb-888ms\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") " pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.510884 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.548978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-webhook-cert\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.549469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8n2r\" (UniqueName: \"kubernetes.io/projected/b985bc65-1311-4e4b-a037-f73bb4178c84-kube-api-access-x8n2r\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.549508 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-apiservice-cert\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.652928 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-webhook-cert\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.653022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8n2r\" (UniqueName: \"kubernetes.io/projected/b985bc65-1311-4e4b-a037-f73bb4178c84-kube-api-access-x8n2r\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.653089 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-apiservice-cert\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.659219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-apiservice-cert\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: 
\"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.662361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-webhook-cert\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.674115 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8n2r\" (UniqueName: \"kubernetes.io/projected/b985bc65-1311-4e4b-a037-f73bb4178c84-kube-api-access-x8n2r\") pod \"metallb-operator-webhook-server-6748f49456-mtpnf\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") " pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.774412 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"] Oct 01 16:14:02 crc kubenswrapper[4764]: I1001 16:14:02.871162 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:03 crc kubenswrapper[4764]: I1001 16:14:03.254320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" event={"ID":"a75ea530-e233-41d3-a607-07ad2b30c4f1","Type":"ContainerStarted","Data":"edd3f3289bd44e4daa76dcbed8e719922a5aed416c837440722c0ec8da2effec"} Oct 01 16:14:03 crc kubenswrapper[4764]: I1001 16:14:03.278463 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"] Oct 01 16:14:04 crc kubenswrapper[4764]: I1001 16:14:04.263302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" event={"ID":"b985bc65-1311-4e4b-a037-f73bb4178c84","Type":"ContainerStarted","Data":"eeee3fabfc152147b854f5209c87846a2dd4f3ffdb679d8f5598d7fe9ddb222e"} Oct 01 16:14:06 crc kubenswrapper[4764]: I1001 16:14:06.288400 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" event={"ID":"a75ea530-e233-41d3-a607-07ad2b30c4f1","Type":"ContainerStarted","Data":"06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949"} Oct 01 16:14:06 crc kubenswrapper[4764]: I1001 16:14:06.288805 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:06 crc kubenswrapper[4764]: I1001 16:14:06.307379 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" podStartSLOduration=1.308191221 podStartE2EDuration="4.307357654s" podCreationTimestamp="2025-10-01 16:14:02 +0000 UTC" firstStartedPulling="2025-10-01 16:14:02.782016984 +0000 UTC m=+705.781663829" lastFinishedPulling="2025-10-01 16:14:05.781183427 +0000 UTC 
m=+708.780830262" observedRunningTime="2025-10-01 16:14:06.303797486 +0000 UTC m=+709.303444331" watchObservedRunningTime="2025-10-01 16:14:06.307357654 +0000 UTC m=+709.307004489" Oct 01 16:14:08 crc kubenswrapper[4764]: I1001 16:14:08.300549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" event={"ID":"b985bc65-1311-4e4b-a037-f73bb4178c84","Type":"ContainerStarted","Data":"6d94563ba329750d57d1b7f28cd92bff0b61c481aaf512332043cd4518d4940d"} Oct 01 16:14:08 crc kubenswrapper[4764]: I1001 16:14:08.300948 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:08 crc kubenswrapper[4764]: I1001 16:14:08.326434 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" podStartSLOduration=2.05028317 podStartE2EDuration="6.326412155s" podCreationTimestamp="2025-10-01 16:14:02 +0000 UTC" firstStartedPulling="2025-10-01 16:14:03.288219416 +0000 UTC m=+706.287866261" lastFinishedPulling="2025-10-01 16:14:07.564348401 +0000 UTC m=+710.563995246" observedRunningTime="2025-10-01 16:14:08.321370491 +0000 UTC m=+711.321017346" watchObservedRunningTime="2025-10-01 16:14:08.326412155 +0000 UTC m=+711.326059000" Oct 01 16:14:22 crc kubenswrapper[4764]: I1001 16:14:22.880487 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" Oct 01 16:14:42 crc kubenswrapper[4764]: I1001 16:14:42.514070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.341596 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"] Oct 01 16:14:43 crc 
kubenswrapper[4764]: I1001 16:14:43.342759 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.345188 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.345195 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-bfp67" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.345927 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6tr29"] Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.348449 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.350146 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.350431 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.360619 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"] Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421034 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2djw\" (UniqueName: \"kubernetes.io/projected/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-kube-api-access-t2djw\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-startup\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-reloader\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6764cb75-6b7f-45aa-a7b7-6347f50022f7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-j2wj8\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421483 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-sockets\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421502 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-conf\") pod \"frr-k8s-6tr29\" 
(UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trlhl\" (UniqueName: \"kubernetes.io/projected/6764cb75-6b7f-45aa-a7b7-6347f50022f7-kube-api-access-trlhl\") pod \"frr-k8s-webhook-server-5478bdb765-j2wj8\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.421833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics-certs\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.422284 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-dq7mc"] Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.423229 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.425308 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.425732 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-48mpk" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.426037 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.426106 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.445206 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-f8m22"] Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.446123 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.447984 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.457811 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-f8m22"] Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-startup\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-reloader\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/6764cb75-6b7f-45aa-a7b7-6347f50022f7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-j2wj8\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-sockets\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-conf\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.523988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-sockets\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-reloader\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 
16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metallb-excludel2\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-metrics-certs\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-conf\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sxf9\" (UniqueName: \"kubernetes.io/projected/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-kube-api-access-7sxf9\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524246 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-cert\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 
16:14:43.524287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trlhl\" (UniqueName: \"kubernetes.io/projected/6764cb75-6b7f-45aa-a7b7-6347f50022f7-kube-api-access-trlhl\") pod \"frr-k8s-webhook-server-5478bdb765-j2wj8\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics-certs\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524380 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79mc\" (UniqueName: \"kubernetes.io/projected/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-kube-api-access-t79mc\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2djw\" (UniqueName: \"kubernetes.io/projected/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-kube-api-access-t2djw\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524420 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-startup\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.524427 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.529633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics-certs\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.546742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trlhl\" (UniqueName: \"kubernetes.io/projected/6764cb75-6b7f-45aa-a7b7-6347f50022f7-kube-api-access-trlhl\") pod \"frr-k8s-webhook-server-5478bdb765-j2wj8\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.550604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6764cb75-6b7f-45aa-a7b7-6347f50022f7-cert\") pod \"frr-k8s-webhook-server-5478bdb765-j2wj8\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.561743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2djw\" (UniqueName: \"kubernetes.io/projected/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-kube-api-access-t2djw\") pod \"frr-k8s-6tr29\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metallb-excludel2\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-metrics-certs\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sxf9\" (UniqueName: \"kubernetes.io/projected/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-kube-api-access-7sxf9\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625367 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-cert\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t79mc\" (UniqueName: \"kubernetes.io/projected/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-kube-api-access-t79mc\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs\") pod 
\"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.625446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: E1001 16:14:43.625553 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 16:14:43 crc kubenswrapper[4764]: E1001 16:14:43.625603 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist podName:5b219b21-c1fa-4af1-9d7a-ad0f00c987b3 nodeName:}" failed. No retries permitted until 2025-10-01 16:14:44.125587727 +0000 UTC m=+747.125234562 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist") pod "speaker-dq7mc" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3") : secret "metallb-memberlist" not found Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.626439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metallb-excludel2\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: E1001 16:14:43.627033 4764 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 01 16:14:43 crc kubenswrapper[4764]: E1001 16:14:43.627084 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs podName:5b219b21-c1fa-4af1-9d7a-ad0f00c987b3 nodeName:}" failed. No retries permitted until 2025-10-01 16:14:44.127074494 +0000 UTC m=+747.126721329 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs") pod "speaker-dq7mc" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3") : secret "speaker-certs-secret" not found Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.629833 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.629972 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-metrics-certs\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.642376 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sxf9\" (UniqueName: \"kubernetes.io/projected/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-kube-api-access-7sxf9\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.642642 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-cert\") pod \"controller-5d688f5ffc-f8m22\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.643354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79mc\" (UniqueName: \"kubernetes.io/projected/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-kube-api-access-t79mc\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.669668 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.677568 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6tr29" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.761092 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:43 crc kubenswrapper[4764]: I1001 16:14:43.988963 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-f8m22"] Oct 01 16:14:43 crc kubenswrapper[4764]: W1001 16:14:43.994449 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90ebc18d_5d97_4850_8c9e_2c0d71fcea0f.slice/crio-254719c5c86c1510275c14c63a880ebf84815460f48a9518f2544f544fdedeff WatchSource:0}: Error finding container 254719c5c86c1510275c14c63a880ebf84815460f48a9518f2544f544fdedeff: Status 404 returned error can't find the container with id 254719c5c86c1510275c14c63a880ebf84815460f48a9518f2544f544fdedeff Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.130647 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.130704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:44 crc kubenswrapper[4764]: E1001 16:14:44.130870 4764 
secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 16:14:44 crc kubenswrapper[4764]: E1001 16:14:44.130920 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist podName:5b219b21-c1fa-4af1-9d7a-ad0f00c987b3 nodeName:}" failed. No retries permitted until 2025-10-01 16:14:45.130905607 +0000 UTC m=+748.130552452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist") pod "speaker-dq7mc" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3") : secret "metallb-memberlist" not found Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.135764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.166310 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"] Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.561825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-f8m22" event={"ID":"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f","Type":"ContainerStarted","Data":"a15bb4ffa2041e6f15271ed5b94b680a1ef79cc20d7619d3f5d6ced6a91409e1"} Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.561911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-f8m22" event={"ID":"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f","Type":"ContainerStarted","Data":"254719c5c86c1510275c14c63a880ebf84815460f48a9518f2544f544fdedeff"} Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.563445 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" event={"ID":"6764cb75-6b7f-45aa-a7b7-6347f50022f7","Type":"ContainerStarted","Data":"fcaa3c0d7590cc820ca8bf799421f6c149bcdccf5f2aa209c4d3b17b5289243b"} Oct 01 16:14:44 crc kubenswrapper[4764]: I1001 16:14:44.565424 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"8361eaa1921314acb5243d433a769f8740d53b92acb7ed47aaa81388286f8c5e"} Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.145963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.153289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") pod \"speaker-dq7mc\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " pod="metallb-system/speaker-dq7mc" Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.238276 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dq7mc" Oct 01 16:14:45 crc kubenswrapper[4764]: W1001 16:14:45.275339 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b219b21_c1fa_4af1_9d7a_ad0f00c987b3.slice/crio-bcd45988bb37d5c383e173e7d53d924d5d0dc03bf21c64e9911910abef936806 WatchSource:0}: Error finding container bcd45988bb37d5c383e173e7d53d924d5d0dc03bf21c64e9911910abef936806: Status 404 returned error can't find the container with id bcd45988bb37d5c383e173e7d53d924d5d0dc03bf21c64e9911910abef936806 Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.576466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-f8m22" event={"ID":"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f","Type":"ContainerStarted","Data":"e1f282d32435aeb8cdeea1239affa34936768bdc60c7d1d8e34946f10b152c13"} Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.576543 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.578068 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dq7mc" event={"ID":"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3","Type":"ContainerStarted","Data":"bcd45988bb37d5c383e173e7d53d924d5d0dc03bf21c64e9911910abef936806"} Oct 01 16:14:45 crc kubenswrapper[4764]: I1001 16:14:45.603646 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-f8m22" podStartSLOduration=2.603630038 podStartE2EDuration="2.603630038s" podCreationTimestamp="2025-10-01 16:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:14:45.601187747 +0000 UTC m=+748.600834582" watchObservedRunningTime="2025-10-01 16:14:45.603630038 +0000 UTC m=+748.603276873" Oct 01 16:14:46 crc 
kubenswrapper[4764]: I1001 16:14:46.586241 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dq7mc" event={"ID":"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3","Type":"ContainerStarted","Data":"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1"} Oct 01 16:14:46 crc kubenswrapper[4764]: I1001 16:14:46.586563 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-dq7mc" Oct 01 16:14:46 crc kubenswrapper[4764]: I1001 16:14:46.586583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-dq7mc" event={"ID":"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3","Type":"ContainerStarted","Data":"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a"} Oct 01 16:14:46 crc kubenswrapper[4764]: I1001 16:14:46.600676 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-dq7mc" podStartSLOduration=3.600661791 podStartE2EDuration="3.600661791s" podCreationTimestamp="2025-10-01 16:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:14:46.600516647 +0000 UTC m=+749.600163482" watchObservedRunningTime="2025-10-01 16:14:46.600661791 +0000 UTC m=+749.600308626" Oct 01 16:14:51 crc kubenswrapper[4764]: I1001 16:14:51.016713 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j7l4"] Oct 01 16:14:51 crc kubenswrapper[4764]: I1001 16:14:51.017290 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" containerID="cri-o://6bf3b8f25815634bcc2b6e429f92701176838961f57da4fcf167a89026396f0f" gracePeriod=30 Oct 01 16:14:51 crc kubenswrapper[4764]: I1001 16:14:51.082301 4764 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc"] Oct 01 16:14:51 crc kubenswrapper[4764]: I1001 16:14:51.082496 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" podUID="aedccabf-870c-4bca-9d09-028aa2416702" containerName="route-controller-manager" containerID="cri-o://d7a75f11f4623f4255211cca3376549e5cdb9ab48ec5a134aa7014cc1e28103d" gracePeriod=30 Oct 01 16:14:53 crc kubenswrapper[4764]: I1001 16:14:53.654576 4764 generic.go:334] "Generic (PLEG): container finished" podID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerID="6bf3b8f25815634bcc2b6e429f92701176838961f57da4fcf167a89026396f0f" exitCode=0 Oct 01 16:14:53 crc kubenswrapper[4764]: I1001 16:14:53.654719 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" event={"ID":"1dc737f5-37f4-47a3-8716-af033cbe27fc","Type":"ContainerDied","Data":"6bf3b8f25815634bcc2b6e429f92701176838961f57da4fcf167a89026396f0f"} Oct 01 16:14:54 crc kubenswrapper[4764]: I1001 16:14:54.665514 4764 generic.go:334] "Generic (PLEG): container finished" podID="aedccabf-870c-4bca-9d09-028aa2416702" containerID="d7a75f11f4623f4255211cca3376549e5cdb9ab48ec5a134aa7014cc1e28103d" exitCode=0 Oct 01 16:14:54 crc kubenswrapper[4764]: I1001 16:14:54.665565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" event={"ID":"aedccabf-870c-4bca-9d09-028aa2416702","Type":"ContainerDied","Data":"d7a75f11f4623f4255211cca3376549e5cdb9ab48ec5a134aa7014cc1e28103d"} Oct 01 16:14:55 crc kubenswrapper[4764]: I1001 16:14:55.242351 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-dq7mc" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.179340 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-nvlvk"] Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.181273 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nvlvk" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.186599 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.186691 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.200143 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nvlvk"] Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.231033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgbz\" (UniqueName: \"kubernetes.io/projected/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54-kube-api-access-hcgbz\") pod \"openstack-operator-index-nvlvk\" (UID: \"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54\") " pod="openstack-operators/openstack-operator-index-nvlvk" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.332548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgbz\" (UniqueName: \"kubernetes.io/projected/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54-kube-api-access-hcgbz\") pod \"openstack-operator-index-nvlvk\" (UID: \"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54\") " pod="openstack-operators/openstack-operator-index-nvlvk" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.359947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcgbz\" (UniqueName: \"kubernetes.io/projected/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54-kube-api-access-hcgbz\") pod \"openstack-operator-index-nvlvk\" (UID: \"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54\") " 
pod="openstack-operators/openstack-operator-index-nvlvk" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.498416 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nvlvk" Oct 01 16:14:58 crc kubenswrapper[4764]: I1001 16:14:58.748027 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nvlvk"] Oct 01 16:14:58 crc kubenswrapper[4764]: W1001 16:14:58.755605 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4884b6b6_de9a_4a43_b4b2_42ce6e6efa54.slice/crio-63d0cab02c0fcb4678b9e31b115a583ca773ec81fc51bddfc8572884e83e76d9 WatchSource:0}: Error finding container 63d0cab02c0fcb4678b9e31b115a583ca773ec81fc51bddfc8572884e83e76d9: Status 404 returned error can't find the container with id 63d0cab02c0fcb4678b9e31b115a583ca773ec81fc51bddfc8572884e83e76d9 Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.498406 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.540169 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp"] Oct 01 16:14:59 crc kubenswrapper[4764]: E1001 16:14:59.540418 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aedccabf-870c-4bca-9d09-028aa2416702" containerName="route-controller-manager" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.540435 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aedccabf-870c-4bca-9d09-028aa2416702" containerName="route-controller-manager" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.540527 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="aedccabf-870c-4bca-9d09-028aa2416702" containerName="route-controller-manager" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.540871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.553072 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp"] Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.601427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-586jt\" (UniqueName: \"kubernetes.io/projected/aedccabf-870c-4bca-9d09-028aa2416702-kube-api-access-586jt\") pod \"aedccabf-870c-4bca-9d09-028aa2416702\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.601503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-client-ca\") pod \"aedccabf-870c-4bca-9d09-028aa2416702\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.601556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-config\") pod \"aedccabf-870c-4bca-9d09-028aa2416702\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.601582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aedccabf-870c-4bca-9d09-028aa2416702-serving-cert\") pod \"aedccabf-870c-4bca-9d09-028aa2416702\" (UID: \"aedccabf-870c-4bca-9d09-028aa2416702\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.602458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-client-ca" (OuterVolumeSpecName: "client-ca") pod "aedccabf-870c-4bca-9d09-028aa2416702" 
(UID: "aedccabf-870c-4bca-9d09-028aa2416702"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.603245 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.606575 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-config" (OuterVolumeSpecName: "config") pod "aedccabf-870c-4bca-9d09-028aa2416702" (UID: "aedccabf-870c-4bca-9d09-028aa2416702"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.607587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aedccabf-870c-4bca-9d09-028aa2416702-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aedccabf-870c-4bca-9d09-028aa2416702" (UID: "aedccabf-870c-4bca-9d09-028aa2416702"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.607716 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedccabf-870c-4bca-9d09-028aa2416702-kube-api-access-586jt" (OuterVolumeSpecName: "kube-api-access-586jt") pod "aedccabf-870c-4bca-9d09-028aa2416702" (UID: "aedccabf-870c-4bca-9d09-028aa2416702"). InnerVolumeSpecName "kube-api-access-586jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.638214 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.697030 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.697022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" event={"ID":"1dc737f5-37f4-47a3-8716-af033cbe27fc","Type":"ContainerDied","Data":"06ac98415128dc7aa7722425750012ad3abd4d776e9ce4dbcc2b3770eb1937d8"} Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.697217 4764 scope.go:117] "RemoveContainer" containerID="6bf3b8f25815634bcc2b6e429f92701176838961f57da4fcf167a89026396f0f" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.697990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nvlvk" event={"ID":"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54","Type":"ContainerStarted","Data":"63d0cab02c0fcb4678b9e31b115a583ca773ec81fc51bddfc8572884e83e76d9"} Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.699495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" event={"ID":"aedccabf-870c-4bca-9d09-028aa2416702","Type":"ContainerDied","Data":"d1f386a5da4dc69ef3e50d7e450652b7c4085ce627ecf5b7cc9c837a60e1d5c0"} Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.699558 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704300 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f83df15-f436-427b-ae62-661e7843cde2-serving-cert\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv8c7\" (UniqueName: \"kubernetes.io/projected/2f83df15-f436-427b-ae62-661e7843cde2-kube-api-access-mv8c7\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f83df15-f436-427b-ae62-661e7843cde2-config\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704518 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f83df15-f436-427b-ae62-661e7843cde2-client-ca\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704577 4764 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedccabf-870c-4bca-9d09-028aa2416702-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704597 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aedccabf-870c-4bca-9d09-028aa2416702-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.704612 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-586jt\" (UniqueName: \"kubernetes.io/projected/aedccabf-870c-4bca-9d09-028aa2416702-kube-api-access-586jt\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.717049 4764 scope.go:117] "RemoveContainer" containerID="d7a75f11f4623f4255211cca3376549e5cdb9ab48ec5a134aa7014cc1e28103d" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.744329 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc"] Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.744448 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n5qtc"] Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806088 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-client-ca\") pod \"1dc737f5-37f4-47a3-8716-af033cbe27fc\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806157 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-config\") pod \"1dc737f5-37f4-47a3-8716-af033cbe27fc\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " Oct 01 
16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-proxy-ca-bundles\") pod \"1dc737f5-37f4-47a3-8716-af033cbe27fc\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54tbr\" (UniqueName: \"kubernetes.io/projected/1dc737f5-37f4-47a3-8716-af033cbe27fc-kube-api-access-54tbr\") pod \"1dc737f5-37f4-47a3-8716-af033cbe27fc\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806354 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc737f5-37f4-47a3-8716-af033cbe27fc-serving-cert\") pod \"1dc737f5-37f4-47a3-8716-af033cbe27fc\" (UID: \"1dc737f5-37f4-47a3-8716-af033cbe27fc\") " Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f83df15-f436-427b-ae62-661e7843cde2-serving-cert\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806725 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv8c7\" (UniqueName: \"kubernetes.io/projected/2f83df15-f436-427b-ae62-661e7843cde2-kube-api-access-mv8c7\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 
16:14:59.806823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f83df15-f436-427b-ae62-661e7843cde2-config\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806883 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f83df15-f436-427b-ae62-661e7843cde2-client-ca\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.806896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1dc737f5-37f4-47a3-8716-af033cbe27fc" (UID: "1dc737f5-37f4-47a3-8716-af033cbe27fc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.807365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "1dc737f5-37f4-47a3-8716-af033cbe27fc" (UID: "1dc737f5-37f4-47a3-8716-af033cbe27fc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.807401 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-config" (OuterVolumeSpecName: "config") pod "1dc737f5-37f4-47a3-8716-af033cbe27fc" (UID: "1dc737f5-37f4-47a3-8716-af033cbe27fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.807786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f83df15-f436-427b-ae62-661e7843cde2-client-ca\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.809010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f83df15-f436-427b-ae62-661e7843cde2-config\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.811996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc737f5-37f4-47a3-8716-af033cbe27fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1dc737f5-37f4-47a3-8716-af033cbe27fc" (UID: "1dc737f5-37f4-47a3-8716-af033cbe27fc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.812132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc737f5-37f4-47a3-8716-af033cbe27fc-kube-api-access-54tbr" (OuterVolumeSpecName: "kube-api-access-54tbr") pod "1dc737f5-37f4-47a3-8716-af033cbe27fc" (UID: "1dc737f5-37f4-47a3-8716-af033cbe27fc"). InnerVolumeSpecName "kube-api-access-54tbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.812398 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f83df15-f436-427b-ae62-661e7843cde2-serving-cert\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.835479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv8c7\" (UniqueName: \"kubernetes.io/projected/2f83df15-f436-427b-ae62-661e7843cde2-kube-api-access-mv8c7\") pod \"route-controller-manager-79c5cf8d95-46cgp\" (UID: \"2f83df15-f436-427b-ae62-661e7843cde2\") " pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.868871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.908425 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.908820 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.908896 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1dc737f5-37f4-47a3-8716-af033cbe27fc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.909468 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54tbr\" (UniqueName: \"kubernetes.io/projected/1dc737f5-37f4-47a3-8716-af033cbe27fc-kube-api-access-54tbr\") on node \"crc\" DevicePath \"\"" Oct 01 16:14:59 crc kubenswrapper[4764]: I1001 16:14:59.909561 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dc737f5-37f4-47a3-8716-af033cbe27fc-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.023332 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j7l4"] Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.030873 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6j7l4"] Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.092101 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp"] Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.129996 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn"] Oct 01 16:15:00 crc kubenswrapper[4764]: E1001 16:15:00.130383 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.130406 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.130539 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.131832 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.134169 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.134415 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.145017 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn"] Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.316806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7b9604-377c-4942-8802-a802ab7f9783-config-volume\") pod 
\"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.316873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2hd\" (UniqueName: \"kubernetes.io/projected/3f7b9604-377c-4942-8802-a802ab7f9783-kube-api-access-rw2hd\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.317038 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7b9604-377c-4942-8802-a802ab7f9783-secret-volume\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.418454 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7b9604-377c-4942-8802-a802ab7f9783-secret-volume\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.418621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7b9604-377c-4942-8802-a802ab7f9783-config-volume\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.418736 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rw2hd\" (UniqueName: \"kubernetes.io/projected/3f7b9604-377c-4942-8802-a802ab7f9783-kube-api-access-rw2hd\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.420346 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7b9604-377c-4942-8802-a802ab7f9783-config-volume\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.425521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7b9604-377c-4942-8802-a802ab7f9783-secret-volume\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.442465 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2hd\" (UniqueName: \"kubernetes.io/projected/3f7b9604-377c-4942-8802-a802ab7f9783-kube-api-access-rw2hd\") pod \"collect-profiles-29322255-w4lzn\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.463746 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.608601 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6j7l4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.608703 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6j7l4" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.706398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn"] Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.710539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" event={"ID":"2f83df15-f436-427b-ae62-661e7843cde2","Type":"ContainerStarted","Data":"ca0e6b849a1e7d0d89d959cb3bf9fb1ddea07307428beb5f122ef6068665f78f"} Oct 01 16:15:00 crc kubenswrapper[4764]: I1001 16:15:00.940037 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nvlvk"] Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.548995 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dx2zz"] Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.550528 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dx2zz" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.560019 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dx2zz"] Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.635462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jff27\" (UniqueName: \"kubernetes.io/projected/539d5390-0098-46b3-b552-9c195919b57b-kube-api-access-jff27\") pod \"openstack-operator-index-dx2zz\" (UID: \"539d5390-0098-46b3-b552-9c195919b57b\") " pod="openstack-operators/openstack-operator-index-dx2zz" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.701433 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t"] Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.702982 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.706174 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.706655 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.706836 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.706931 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.707782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 
16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.708193 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.713688 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t"] Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.717019 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.731389 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc737f5-37f4-47a3-8716-af033cbe27fc" path="/var/lib/kubelet/pods/1dc737f5-37f4-47a3-8716-af033cbe27fc/volumes" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.732853 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aedccabf-870c-4bca-9d09-028aa2416702" path="/var/lib/kubelet/pods/aedccabf-870c-4bca-9d09-028aa2416702/volumes" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.734899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" event={"ID":"2f83df15-f436-427b-ae62-661e7843cde2","Type":"ContainerStarted","Data":"0e852772eeb9f77d08badc671895bc998b943bd72edfb032abcc8e2ebb990a2c"} Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.734936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" event={"ID":"3f7b9604-377c-4942-8802-a802ab7f9783","Type":"ContainerStarted","Data":"61766ee2c2e081c011af633b6db84d22f907ab8d5c8a89dbdd4965c5640a1090"} Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.737310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-client-ca\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.737357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-proxy-ca-bundles\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.737443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-config\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.737561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q69f\" (UniqueName: \"kubernetes.io/projected/4f38df15-a2b7-4198-8c71-a025551fc6cf-kube-api-access-5q69f\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.737607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f38df15-a2b7-4198-8c71-a025551fc6cf-serving-cert\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" 
Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.737733 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jff27\" (UniqueName: \"kubernetes.io/projected/539d5390-0098-46b3-b552-9c195919b57b-kube-api-access-jff27\") pod \"openstack-operator-index-dx2zz\" (UID: \"539d5390-0098-46b3-b552-9c195919b57b\") " pod="openstack-operators/openstack-operator-index-dx2zz" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.768181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jff27\" (UniqueName: \"kubernetes.io/projected/539d5390-0098-46b3-b552-9c195919b57b-kube-api-access-jff27\") pod \"openstack-operator-index-dx2zz\" (UID: \"539d5390-0098-46b3-b552-9c195919b57b\") " pod="openstack-operators/openstack-operator-index-dx2zz" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.839005 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-client-ca\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.839119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-proxy-ca-bundles\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.839206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-config\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " 
pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.839261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q69f\" (UniqueName: \"kubernetes.io/projected/4f38df15-a2b7-4198-8c71-a025551fc6cf-kube-api-access-5q69f\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.839303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f38df15-a2b7-4198-8c71-a025551fc6cf-serving-cert\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.840183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-client-ca\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.840753 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-config\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.840960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f38df15-a2b7-4198-8c71-a025551fc6cf-proxy-ca-bundles\") pod 
\"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.844957 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f38df15-a2b7-4198-8c71-a025551fc6cf-serving-cert\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.868910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q69f\" (UniqueName: \"kubernetes.io/projected/4f38df15-a2b7-4198-8c71-a025551fc6cf-kube-api-access-5q69f\") pod \"controller-manager-6fcb47c88f-hxk8t\" (UID: \"4f38df15-a2b7-4198-8c71-a025551fc6cf\") " pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:01 crc kubenswrapper[4764]: I1001 16:15:01.878330 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dx2zz" Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.041835 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:02 crc kubenswrapper[4764]: E1001 16:15:02.043004 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/frr-rhel9@sha256:a0db688989ed590c75c722e2891572112a1f5b1f714f894ea814b7026cd9adb4" Oct 01 16:15:02 crc kubenswrapper[4764]: E1001 16:15:02.043192 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:cp-frr-files,Image:registry.redhat.io/openshift4/frr-rhel9@sha256:a0db688989ed590c75c722e2891572112a1f5b1f714f894ea814b7026cd9adb4,Command:[/bin/sh -c cp -rLf /tmp/frr/* /etc/frr/],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:frr-startup,ReadOnly:false,MountPath:/tmp/frr,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:frr-conf,ReadOnly:false,MountPath:/etc/frr,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2djw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*100,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*101,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod frr-k8s-6tr29_metallb-system(b922335a-fbf4-41ec-99b0-dafbd8b24bf5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:15:02 crc kubenswrapper[4764]: E1001 16:15:02.046175 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cp-frr-files\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.342671 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dx2zz"] Oct 01 16:15:02 crc kubenswrapper[4764]: W1001 16:15:02.354208 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod539d5390_0098_46b3_b552_9c195919b57b.slice/crio-a135ff954e3a5fbc00a4ddaf01bf1b9dfcd6273dd5984aabf2e9c534a3e3cede WatchSource:0}: Error finding container a135ff954e3a5fbc00a4ddaf01bf1b9dfcd6273dd5984aabf2e9c534a3e3cede: Status 404 returned error can't find the container with id a135ff954e3a5fbc00a4ddaf01bf1b9dfcd6273dd5984aabf2e9c534a3e3cede Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.468168 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t"] Oct 01 16:15:02 crc kubenswrapper[4764]: W1001 16:15:02.471829 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f38df15_a2b7_4198_8c71_a025551fc6cf.slice/crio-dea6d9a30af744a13a87121c19347c4410a9d299493581c69aaf7b76e66724fe WatchSource:0}: Error finding container dea6d9a30af744a13a87121c19347c4410a9d299493581c69aaf7b76e66724fe: Status 404 returned error can't find the container with id 
dea6d9a30af744a13a87121c19347c4410a9d299493581c69aaf7b76e66724fe Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.730756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" event={"ID":"3f7b9604-377c-4942-8802-a802ab7f9783","Type":"ContainerStarted","Data":"97b9dc84f2670f9df086ec1f1029c96c15dd629cdc28af00b32f5e084c9ad69b"} Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.733867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" event={"ID":"4f38df15-a2b7-4198-8c71-a025551fc6cf","Type":"ContainerStarted","Data":"dea6d9a30af744a13a87121c19347c4410a9d299493581c69aaf7b76e66724fe"} Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.736219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dx2zz" event={"ID":"539d5390-0098-46b3-b552-9c195919b57b","Type":"ContainerStarted","Data":"a135ff954e3a5fbc00a4ddaf01bf1b9dfcd6273dd5984aabf2e9c534a3e3cede"} Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.736482 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:15:02 crc kubenswrapper[4764]: E1001 16:15:02.738305 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cp-frr-files\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/frr-rhel9@sha256:a0db688989ed590c75c722e2891572112a1f5b1f714f894ea814b7026cd9adb4\\\"\"" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.746261 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 
16:15:02.754735 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" podStartSLOduration=2.754720567 podStartE2EDuration="2.754720567s" podCreationTimestamp="2025-10-01 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:15:02.752768578 +0000 UTC m=+765.752415433" watchObservedRunningTime="2025-10-01 16:15:02.754720567 +0000 UTC m=+765.754367402" Oct 01 16:15:02 crc kubenswrapper[4764]: I1001 16:15:02.779562 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79c5cf8d95-46cgp" podStartSLOduration=11.779550521 podStartE2EDuration="11.779550521s" podCreationTimestamp="2025-10-01 16:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:15:02.779228743 +0000 UTC m=+765.778875578" watchObservedRunningTime="2025-10-01 16:15:02.779550521 +0000 UTC m=+765.779197356" Oct 01 16:15:03 crc kubenswrapper[4764]: I1001 16:15:03.588325 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 16:15:03 crc kubenswrapper[4764]: I1001 16:15:03.762888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" event={"ID":"4f38df15-a2b7-4198-8c71-a025551fc6cf","Type":"ContainerStarted","Data":"717161976feb4ecd5581d3e76398f4b525571d440da5c2cfdeee05608fb930be"} Oct 01 16:15:03 crc kubenswrapper[4764]: I1001 16:15:03.788700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:15:04 crc kubenswrapper[4764]: E1001 16:15:04.663718 4764 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/frr-rhel9@sha256:a0db688989ed590c75c722e2891572112a1f5b1f714f894ea814b7026cd9adb4" Oct 01 16:15:04 crc kubenswrapper[4764]: E1001 16:15:04.663908 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:frr-k8s-webhook-server,Image:registry.redhat.io/openshift4/frr-rhel9@sha256:a0db688989ed590c75c722e2891572112a1f5b1f714f894ea814b7026cd9adb4,Command:[/frr-k8s],Args:[--log-level=debug --webhook-mode=onlywebhook --disable-cert-rotation=true --namespace=$(NAMESPACE) --metrics-bind-address=:7572],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7572,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trlhl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 
monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod frr-k8s-webhook-server-5478bdb765-j2wj8_metallb-system(6764cb75-6b7f-45aa-a7b7-6347f50022f7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 16:15:04 crc kubenswrapper[4764]: E1001 16:15:04.665180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"frr-k8s-webhook-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" Oct 01 16:15:04 crc kubenswrapper[4764]: I1001 16:15:04.771633 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f7b9604-377c-4942-8802-a802ab7f9783" containerID="97b9dc84f2670f9df086ec1f1029c96c15dd629cdc28af00b32f5e084c9ad69b" exitCode=0 Oct 01 16:15:04 crc kubenswrapper[4764]: I1001 16:15:04.772393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" 
event={"ID":"3f7b9604-377c-4942-8802-a802ab7f9783","Type":"ContainerDied","Data":"97b9dc84f2670f9df086ec1f1029c96c15dd629cdc28af00b32f5e084c9ad69b"} Oct 01 16:15:04 crc kubenswrapper[4764]: I1001 16:15:04.773753 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:04 crc kubenswrapper[4764]: E1001 16:15:04.776014 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"frr-k8s-webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/frr-rhel9@sha256:a0db688989ed590c75c722e2891572112a1f5b1f714f894ea814b7026cd9adb4\\\"\"" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" Oct 01 16:15:04 crc kubenswrapper[4764]: I1001 16:15:04.781004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" Oct 01 16:15:04 crc kubenswrapper[4764]: I1001 16:15:04.831281 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fcb47c88f-hxk8t" podStartSLOduration=13.831255295 podStartE2EDuration="13.831255295s" podCreationTimestamp="2025-10-01 16:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:15:04.822558409 +0000 UTC m=+767.822205254" watchObservedRunningTime="2025-10-01 16:15:04.831255295 +0000 UTC m=+767.830902140" Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.540503 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dx2zz"] Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.625167 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.723135 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7b9604-377c-4942-8802-a802ab7f9783-config-volume\") pod \"3f7b9604-377c-4942-8802-a802ab7f9783\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.723283 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw2hd\" (UniqueName: \"kubernetes.io/projected/3f7b9604-377c-4942-8802-a802ab7f9783-kube-api-access-rw2hd\") pod \"3f7b9604-377c-4942-8802-a802ab7f9783\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.724397 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f7b9604-377c-4942-8802-a802ab7f9783-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f7b9604-377c-4942-8802-a802ab7f9783" (UID: "3f7b9604-377c-4942-8802-a802ab7f9783"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.724617 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7b9604-377c-4942-8802-a802ab7f9783-secret-volume\") pod \"3f7b9604-377c-4942-8802-a802ab7f9783\" (UID: \"3f7b9604-377c-4942-8802-a802ab7f9783\") " Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.724918 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7b9604-377c-4942-8802-a802ab7f9783-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.732240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7b9604-377c-4942-8802-a802ab7f9783-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f7b9604-377c-4942-8802-a802ab7f9783" (UID: "3f7b9604-377c-4942-8802-a802ab7f9783"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.734704 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7b9604-377c-4942-8802-a802ab7f9783-kube-api-access-rw2hd" (OuterVolumeSpecName: "kube-api-access-rw2hd") pod "3f7b9604-377c-4942-8802-a802ab7f9783" (UID: "3f7b9604-377c-4942-8802-a802ab7f9783"). InnerVolumeSpecName "kube-api-access-rw2hd". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.796695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn" event={"ID":"3f7b9604-377c-4942-8802-a802ab7f9783","Type":"ContainerDied","Data":"61766ee2c2e081c011af633b6db84d22f907ab8d5c8a89dbdd4965c5640a1090"}
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.796737 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.796768 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61766ee2c2e081c011af633b6db84d22f907ab8d5c8a89dbdd4965c5640a1090"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.831707 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw2hd\" (UniqueName: \"kubernetes.io/projected/3f7b9604-377c-4942-8802-a802ab7f9783-kube-api-access-rw2hd\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.831823 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f7b9604-377c-4942-8802-a802ab7f9783-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.950439 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-275hc"]
Oct 01 16:15:06 crc kubenswrapper[4764]: E1001 16:15:06.950705 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7b9604-377c-4942-8802-a802ab7f9783" containerName="collect-profiles"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.950720 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7b9604-377c-4942-8802-a802ab7f9783" containerName="collect-profiles"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.950816 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7b9604-377c-4942-8802-a802ab7f9783" containerName="collect-profiles"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.951248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.953455 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-j2jnr"
Oct 01 16:15:06 crc kubenswrapper[4764]: I1001 16:15:06.957151 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-275hc"]
Oct 01 16:15:07 crc kubenswrapper[4764]: I1001 16:15:07.036616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrjb\" (UniqueName: \"kubernetes.io/projected/17f0747f-d044-4d4f-bdad-937743cfb537-kube-api-access-nqrjb\") pod \"openstack-operator-index-275hc\" (UID: \"17f0747f-d044-4d4f-bdad-937743cfb537\") " pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:07 crc kubenswrapper[4764]: I1001 16:15:07.138784 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrjb\" (UniqueName: \"kubernetes.io/projected/17f0747f-d044-4d4f-bdad-937743cfb537-kube-api-access-nqrjb\") pod \"openstack-operator-index-275hc\" (UID: \"17f0747f-d044-4d4f-bdad-937743cfb537\") " pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:07 crc kubenswrapper[4764]: I1001 16:15:07.157167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrjb\" (UniqueName: \"kubernetes.io/projected/17f0747f-d044-4d4f-bdad-937743cfb537-kube-api-access-nqrjb\") pod \"openstack-operator-index-275hc\" (UID: \"17f0747f-d044-4d4f-bdad-937743cfb537\") " pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:07 crc kubenswrapper[4764]: I1001 16:15:07.300966 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:08 crc kubenswrapper[4764]: I1001 16:15:08.900380 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-275hc"]
Oct 01 16:15:09 crc kubenswrapper[4764]: I1001 16:15:09.826759 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-275hc" event={"ID":"17f0747f-d044-4d4f-bdad-937743cfb537","Type":"ContainerStarted","Data":"dc55e8b1534aeb35c24b78d4e98630fad77f4714ba05c52b6b60d60276a5ef11"}
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.855322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-275hc" event={"ID":"17f0747f-d044-4d4f-bdad-937743cfb537","Type":"ContainerStarted","Data":"f0b93863ffd44c762721e462f7b9d109b87d8adc129880cf0ac614051ec312fe"}
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.858396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nvlvk" event={"ID":"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54","Type":"ContainerStarted","Data":"714404a2575e1f7f3f6303b4b8d0765b928d77bc3bc59139fc0a333725dc3964"}
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.858690 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nvlvk" podUID="4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" containerName="registry-server" containerID="cri-o://714404a2575e1f7f3f6303b4b8d0765b928d77bc3bc59139fc0a333725dc3964" gracePeriod=2
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.865417 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dx2zz" event={"ID":"539d5390-0098-46b3-b552-9c195919b57b","Type":"ContainerStarted","Data":"db021384a0a22c294a51efdff321e59fa68b60832eece782c4689c1c8e14bdaf"}
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.865755 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dx2zz" podUID="539d5390-0098-46b3-b552-9c195919b57b" containerName="registry-server" containerID="cri-o://db021384a0a22c294a51efdff321e59fa68b60832eece782c4689c1c8e14bdaf" gracePeriod=2
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.881739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dx2zz"
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.892427 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-275hc" podStartSLOduration=3.791879675 podStartE2EDuration="5.892399968s" podCreationTimestamp="2025-10-01 16:15:06 +0000 UTC" firstStartedPulling="2025-10-01 16:15:09.042214095 +0000 UTC m=+772.041860930" lastFinishedPulling="2025-10-01 16:15:11.142734388 +0000 UTC m=+774.142381223" observedRunningTime="2025-10-01 16:15:11.882395931 +0000 UTC m=+774.882042786" watchObservedRunningTime="2025-10-01 16:15:11.892399968 +0000 UTC m=+774.892046843"
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.911357 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nvlvk" podStartSLOduration=1.525599493 podStartE2EDuration="13.911338086s" podCreationTimestamp="2025-10-01 16:14:58 +0000 UTC" firstStartedPulling="2025-10-01 16:14:58.757506858 +0000 UTC m=+761.757153693" lastFinishedPulling="2025-10-01 16:15:11.143245451 +0000 UTC m=+774.142892286" observedRunningTime="2025-10-01 16:15:11.90096694 +0000 UTC m=+774.900613785" watchObservedRunningTime="2025-10-01 16:15:11.911338086 +0000 UTC m=+774.910984931"
Oct 01 16:15:11 crc kubenswrapper[4764]: I1001 16:15:11.927748 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dx2zz" podStartSLOduration=2.039444215 podStartE2EDuration="10.927727232s" podCreationTimestamp="2025-10-01 16:15:01 +0000 UTC" firstStartedPulling="2025-10-01 16:15:02.357276187 +0000 UTC m=+765.356923022" lastFinishedPulling="2025-10-01 16:15:11.245559184 +0000 UTC m=+774.245206039" observedRunningTime="2025-10-01 16:15:11.919859207 +0000 UTC m=+774.919506052" watchObservedRunningTime="2025-10-01 16:15:11.927727232 +0000 UTC m=+774.927374077"
Oct 01 16:15:12 crc kubenswrapper[4764]: I1001 16:15:12.872927 4764 generic.go:334] "Generic (PLEG): container finished" podID="539d5390-0098-46b3-b552-9c195919b57b" containerID="db021384a0a22c294a51efdff321e59fa68b60832eece782c4689c1c8e14bdaf" exitCode=0
Oct 01 16:15:12 crc kubenswrapper[4764]: I1001 16:15:12.872996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dx2zz" event={"ID":"539d5390-0098-46b3-b552-9c195919b57b","Type":"ContainerDied","Data":"db021384a0a22c294a51efdff321e59fa68b60832eece782c4689c1c8e14bdaf"}
Oct 01 16:15:12 crc kubenswrapper[4764]: I1001 16:15:12.875227 4764 generic.go:334] "Generic (PLEG): container finished" podID="4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" containerID="714404a2575e1f7f3f6303b4b8d0765b928d77bc3bc59139fc0a333725dc3964" exitCode=0
Oct 01 16:15:12 crc kubenswrapper[4764]: I1001 16:15:12.875271 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nvlvk" event={"ID":"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54","Type":"ContainerDied","Data":"714404a2575e1f7f3f6303b4b8d0765b928d77bc3bc59139fc0a333725dc3964"}
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.018439 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nvlvk"
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.024425 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dx2zz"
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.126697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcgbz\" (UniqueName: \"kubernetes.io/projected/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54-kube-api-access-hcgbz\") pod \"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54\" (UID: \"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54\") "
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.126846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jff27\" (UniqueName: \"kubernetes.io/projected/539d5390-0098-46b3-b552-9c195919b57b-kube-api-access-jff27\") pod \"539d5390-0098-46b3-b552-9c195919b57b\" (UID: \"539d5390-0098-46b3-b552-9c195919b57b\") "
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.133345 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/539d5390-0098-46b3-b552-9c195919b57b-kube-api-access-jff27" (OuterVolumeSpecName: "kube-api-access-jff27") pod "539d5390-0098-46b3-b552-9c195919b57b" (UID: "539d5390-0098-46b3-b552-9c195919b57b"). InnerVolumeSpecName "kube-api-access-jff27". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.133708 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54-kube-api-access-hcgbz" (OuterVolumeSpecName: "kube-api-access-hcgbz") pod "4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" (UID: "4884b6b6-de9a-4a43-b4b2-42ce6e6efa54"). InnerVolumeSpecName "kube-api-access-hcgbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.229456 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcgbz\" (UniqueName: \"kubernetes.io/projected/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54-kube-api-access-hcgbz\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.229496 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jff27\" (UniqueName: \"kubernetes.io/projected/539d5390-0098-46b3-b552-9c195919b57b-kube-api-access-jff27\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.885443 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dx2zz"
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.885491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dx2zz" event={"ID":"539d5390-0098-46b3-b552-9c195919b57b","Type":"ContainerDied","Data":"a135ff954e3a5fbc00a4ddaf01bf1b9dfcd6273dd5984aabf2e9c534a3e3cede"}
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.886031 4764 scope.go:117] "RemoveContainer" containerID="db021384a0a22c294a51efdff321e59fa68b60832eece782c4689c1c8e14bdaf"
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.888675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nvlvk" event={"ID":"4884b6b6-de9a-4a43-b4b2-42ce6e6efa54","Type":"ContainerDied","Data":"63d0cab02c0fcb4678b9e31b115a583ca773ec81fc51bddfc8572884e83e76d9"}
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.888740 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nvlvk"
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.920720 4764 scope.go:117] "RemoveContainer" containerID="714404a2575e1f7f3f6303b4b8d0765b928d77bc3bc59139fc0a333725dc3964"
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.926260 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dx2zz"]
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.936537 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dx2zz"]
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.944272 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nvlvk"]
Oct 01 16:15:13 crc kubenswrapper[4764]: I1001 16:15:13.950506 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nvlvk"]
Oct 01 16:15:15 crc kubenswrapper[4764]: I1001 16:15:15.735976 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" path="/var/lib/kubelet/pods/4884b6b6-de9a-4a43-b4b2-42ce6e6efa54/volumes"
Oct 01 16:15:15 crc kubenswrapper[4764]: I1001 16:15:15.736912 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="539d5390-0098-46b3-b552-9c195919b57b" path="/var/lib/kubelet/pods/539d5390-0098-46b3-b552-9c195919b57b/volumes"
Oct 01 16:15:16 crc kubenswrapper[4764]: I1001 16:15:16.913211 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" exitCode=0
Oct 01 16:15:16 crc kubenswrapper[4764]: I1001 16:15:16.913263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerDied","Data":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"}
Oct 01 16:15:17 crc kubenswrapper[4764]: E1001 16:15:17.171403 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb922335a_fbf4_41ec_99b0_dafbd8b24bf5.slice/crio-conmon-019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f.scope\": RecentStats: unable to find data in memory cache]"
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.301614 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.301775 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.334117 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.924413 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" exitCode=0
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.924579 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerDied","Data":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"}
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.929146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" event={"ID":"6764cb75-6b7f-45aa-a7b7-6347f50022f7","Type":"ContainerStarted","Data":"22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c"}
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.929621 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.980348 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-275hc"
Oct 01 16:15:17 crc kubenswrapper[4764]: I1001 16:15:17.987501 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" podStartSLOduration=-9223372001.867308 podStartE2EDuration="34.987468433s" podCreationTimestamp="2025-10-01 16:14:43 +0000 UTC" firstStartedPulling="2025-10-01 16:14:44.185397306 +0000 UTC m=+747.185044161" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:15:17.986513039 +0000 UTC m=+780.986159904" watchObservedRunningTime="2025-10-01 16:15:17.987468433 +0000 UTC m=+780.987115308"
Oct 01 16:15:18 crc kubenswrapper[4764]: I1001 16:15:18.946463 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" exitCode=0
Oct 01 16:15:18 crc kubenswrapper[4764]: I1001 16:15:18.946536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerDied","Data":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"}
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.178182 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"]
Oct 01 16:15:19 crc kubenswrapper[4764]: E1001 16:15:19.178404 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" containerName="registry-server"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.178415 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" containerName="registry-server"
Oct 01 16:15:19 crc kubenswrapper[4764]: E1001 16:15:19.178426 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="539d5390-0098-46b3-b552-9c195919b57b" containerName="registry-server"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.178432 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="539d5390-0098-46b3-b552-9c195919b57b" containerName="registry-server"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.178541 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4884b6b6-de9a-4a43-b4b2-42ce6e6efa54" containerName="registry-server"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.178553 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="539d5390-0098-46b3-b552-9c195919b57b" containerName="registry-server"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.179400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.181029 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-74xhp"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.188936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"]
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.208288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jdr\" (UniqueName: \"kubernetes.io/projected/9944bc89-7591-4fe0-81a2-41dba5a75f37-kube-api-access-88jdr\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.208321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-util\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.208349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-bundle\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.310028 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jdr\" (UniqueName: \"kubernetes.io/projected/9944bc89-7591-4fe0-81a2-41dba5a75f37-kube-api-access-88jdr\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.310125 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-util\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.310174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-bundle\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.310909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-util\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.310983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-bundle\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.333182 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jdr\" (UniqueName: \"kubernetes.io/projected/9944bc89-7591-4fe0-81a2-41dba5a75f37-kube-api-access-88jdr\") pod \"4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") " pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.508468 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.970911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"}
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.971734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"}
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.971758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"}
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.971777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"}
Oct 01 16:15:19 crc kubenswrapper[4764]: I1001 16:15:19.985912 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"]
Oct 01 16:15:19 crc kubenswrapper[4764]: W1001 16:15:19.998278 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9944bc89_7591_4fe0_81a2_41dba5a75f37.slice/crio-d42511e4a28b42dca6977da6b0a6be33ef3c38c24da1fae8b595e9f39a8fadae WatchSource:0}: Error finding container d42511e4a28b42dca6977da6b0a6be33ef3c38c24da1fae8b595e9f39a8fadae: Status 404 returned error can't find the container with id d42511e4a28b42dca6977da6b0a6be33ef3c38c24da1fae8b595e9f39a8fadae
Oct 01 16:15:20 crc kubenswrapper[4764]: I1001 16:15:20.983699 4764 generic.go:334] "Generic (PLEG): container finished" podID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerID="383cbb74959c3b779fc38a3bae4bcbaeb80867d1e94f30179746d65a5af871b5" exitCode=0
Oct 01 16:15:20 crc kubenswrapper[4764]: I1001 16:15:20.983880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5" event={"ID":"9944bc89-7591-4fe0-81a2-41dba5a75f37","Type":"ContainerDied","Data":"383cbb74959c3b779fc38a3bae4bcbaeb80867d1e94f30179746d65a5af871b5"}
Oct 01 16:15:20 crc kubenswrapper[4764]: I1001 16:15:20.983932 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5" event={"ID":"9944bc89-7591-4fe0-81a2-41dba5a75f37","Type":"ContainerStarted","Data":"d42511e4a28b42dca6977da6b0a6be33ef3c38c24da1fae8b595e9f39a8fadae"}
Oct 01 16:15:20 crc kubenswrapper[4764]: I1001 16:15:20.996560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"}
Oct 01 16:15:20 crc kubenswrapper[4764]: I1001 16:15:20.996674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6tr29" event={"ID":"b922335a-fbf4-41ec-99b0-dafbd8b24bf5","Type":"ContainerStarted","Data":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"}
Oct 01 16:15:20 crc kubenswrapper[4764]: I1001 16:15:20.996838 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6tr29"
Oct 01 16:15:21 crc kubenswrapper[4764]: I1001 16:15:21.048920 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6tr29" podStartSLOduration=5.678230395 podStartE2EDuration="38.048899755s" podCreationTimestamp="2025-10-01 16:14:43 +0000 UTC" firstStartedPulling="2025-10-01 16:14:43.831999937 +0000 UTC m=+746.831646772" lastFinishedPulling="2025-10-01 16:15:16.202669277 +0000 UTC m=+779.202316132" observedRunningTime="2025-10-01 16:15:21.044508586 +0000 UTC m=+784.044155431" watchObservedRunningTime="2025-10-01 16:15:21.048899755 +0000 UTC m=+784.048546600"
Oct 01 16:15:21 crc kubenswrapper[4764]: I1001 16:15:21.914004 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 16:15:21 crc kubenswrapper[4764]: I1001 16:15:21.914466 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 16:15:22 crc kubenswrapper[4764]: I1001 16:15:22.004139 4764 generic.go:334] "Generic (PLEG): container finished" podID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerID="85b9e0664e60748a534d2a5d9a561e4d39327b5484179c19e02de287522f565f" exitCode=0
Oct 01 16:15:22 crc kubenswrapper[4764]: I1001 16:15:22.004255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5" event={"ID":"9944bc89-7591-4fe0-81a2-41dba5a75f37","Type":"ContainerDied","Data":"85b9e0664e60748a534d2a5d9a561e4d39327b5484179c19e02de287522f565f"}
Oct 01 16:15:23 crc kubenswrapper[4764]: I1001 16:15:23.013972 4764 generic.go:334] "Generic (PLEG): container finished" podID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerID="7cb7801f9c6c6b6c44d660f73d2beff1489098388d973d612e5108f9df3c557c" exitCode=0
Oct 01 16:15:23 crc kubenswrapper[4764]: I1001 16:15:23.014204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5" event={"ID":"9944bc89-7591-4fe0-81a2-41dba5a75f37","Type":"ContainerDied","Data":"7cb7801f9c6c6b6c44d660f73d2beff1489098388d973d612e5108f9df3c557c"}
Oct 01 16:15:23 crc kubenswrapper[4764]: I1001 16:15:23.678099 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6tr29"
Oct 01 16:15:23 crc kubenswrapper[4764]: I1001 16:15:23.714456 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6tr29"
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.455755 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.478530 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jdr\" (UniqueName: \"kubernetes.io/projected/9944bc89-7591-4fe0-81a2-41dba5a75f37-kube-api-access-88jdr\") pod \"9944bc89-7591-4fe0-81a2-41dba5a75f37\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") "
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.478593 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-util\") pod \"9944bc89-7591-4fe0-81a2-41dba5a75f37\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") "
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.478616 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-bundle\") pod \"9944bc89-7591-4fe0-81a2-41dba5a75f37\" (UID: \"9944bc89-7591-4fe0-81a2-41dba5a75f37\") "
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.479292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-bundle" (OuterVolumeSpecName: "bundle") pod "9944bc89-7591-4fe0-81a2-41dba5a75f37" (UID: "9944bc89-7591-4fe0-81a2-41dba5a75f37"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.484377 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9944bc89-7591-4fe0-81a2-41dba5a75f37-kube-api-access-88jdr" (OuterVolumeSpecName: "kube-api-access-88jdr") pod "9944bc89-7591-4fe0-81a2-41dba5a75f37" (UID: "9944bc89-7591-4fe0-81a2-41dba5a75f37"). InnerVolumeSpecName "kube-api-access-88jdr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.492503 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-util" (OuterVolumeSpecName: "util") pod "9944bc89-7591-4fe0-81a2-41dba5a75f37" (UID: "9944bc89-7591-4fe0-81a2-41dba5a75f37"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.580371 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jdr\" (UniqueName: \"kubernetes.io/projected/9944bc89-7591-4fe0-81a2-41dba5a75f37-kube-api-access-88jdr\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.580415 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-util\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:24 crc kubenswrapper[4764]: I1001 16:15:24.580436 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9944bc89-7591-4fe0-81a2-41dba5a75f37-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.034439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5" event={"ID":"9944bc89-7591-4fe0-81a2-41dba5a75f37","Type":"ContainerDied","Data":"d42511e4a28b42dca6977da6b0a6be33ef3c38c24da1fae8b595e9f39a8fadae"}
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.034503 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42511e4a28b42dca6977da6b0a6be33ef3c38c24da1fae8b595e9f39a8fadae"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.034503 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.361482 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hjwww"]
Oct 01 16:15:25 crc kubenswrapper[4764]: E1001 16:15:25.361855 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="util"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.361882 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="util"
Oct 01 16:15:25 crc kubenswrapper[4764]: E1001 16:15:25.361907 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="extract"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.361952 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="extract"
Oct 01 16:15:25 crc kubenswrapper[4764]: E1001 16:15:25.361976 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="pull"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.361990 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="pull"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.362235 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9944bc89-7591-4fe0-81a2-41dba5a75f37" containerName="extract"
Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.363698 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.382017 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjwww"] Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.393541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-catalog-content\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.393610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-utilities\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.393636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfwqf\" (UniqueName: \"kubernetes.io/projected/4d9b2e56-8e34-44dd-9d50-7c5a66308655-kube-api-access-cfwqf\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.495142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-utilities\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.495202 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cfwqf\" (UniqueName: \"kubernetes.io/projected/4d9b2e56-8e34-44dd-9d50-7c5a66308655-kube-api-access-cfwqf\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.495321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-catalog-content\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.495916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-catalog-content\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.496295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-utilities\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.524067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfwqf\" (UniqueName: \"kubernetes.io/projected/4d9b2e56-8e34-44dd-9d50-7c5a66308655-kube-api-access-cfwqf\") pod \"community-operators-hjwww\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:25 crc kubenswrapper[4764]: I1001 16:15:25.684903 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:26 crc kubenswrapper[4764]: I1001 16:15:26.171402 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hjwww"] Oct 01 16:15:26 crc kubenswrapper[4764]: W1001 16:15:26.175298 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9b2e56_8e34_44dd_9d50_7c5a66308655.slice/crio-5925d3b1912f423e32b5274296b096f8b10add16e06cbfcf37519d63e9909241 WatchSource:0}: Error finding container 5925d3b1912f423e32b5274296b096f8b10add16e06cbfcf37519d63e9909241: Status 404 returned error can't find the container with id 5925d3b1912f423e32b5274296b096f8b10add16e06cbfcf37519d63e9909241 Oct 01 16:15:27 crc kubenswrapper[4764]: I1001 16:15:27.045233 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerID="40aef06b548b8068aeba19d3ede4b398385c784acc0a2f2829160dd8eb3eb88a" exitCode=0 Oct 01 16:15:27 crc kubenswrapper[4764]: I1001 16:15:27.045316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerDied","Data":"40aef06b548b8068aeba19d3ede4b398385c784acc0a2f2829160dd8eb3eb88a"} Oct 01 16:15:27 crc kubenswrapper[4764]: I1001 16:15:27.045539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerStarted","Data":"5925d3b1912f423e32b5274296b096f8b10add16e06cbfcf37519d63e9909241"} Oct 01 16:15:28 crc kubenswrapper[4764]: I1001 16:15:28.054830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" 
event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerStarted","Data":"0dbb36eba0f30777c112f64852d7b0d085c3bf25f85dec378b30fc09f4be14e5"} Oct 01 16:15:29 crc kubenswrapper[4764]: I1001 16:15:29.064099 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerID="0dbb36eba0f30777c112f64852d7b0d085c3bf25f85dec378b30fc09f4be14e5" exitCode=0 Oct 01 16:15:29 crc kubenswrapper[4764]: I1001 16:15:29.064164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerDied","Data":"0dbb36eba0f30777c112f64852d7b0d085c3bf25f85dec378b30fc09f4be14e5"} Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.071374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerStarted","Data":"d084de03de01554a630a99f4e32851a1b35028f45d25e8fbe2bc42c8c4ed3660"} Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.090810 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hjwww" podStartSLOduration=2.486586412 podStartE2EDuration="5.090795744s" podCreationTimestamp="2025-10-01 16:15:25 +0000 UTC" firstStartedPulling="2025-10-01 16:15:27.047674926 +0000 UTC m=+790.047321761" lastFinishedPulling="2025-10-01 16:15:29.651884218 +0000 UTC m=+792.651531093" observedRunningTime="2025-10-01 16:15:30.088972839 +0000 UTC m=+793.088619684" watchObservedRunningTime="2025-10-01 16:15:30.090795744 +0000 UTC m=+793.090442579" Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.435285 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k"] Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.437439 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.441116 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qtmn9" Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.461263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k"] Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.462024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xb98\" (UniqueName: \"kubernetes.io/projected/79ca6c6a-0b9f-4122-87ff-4eeb56046125-kube-api-access-7xb98\") pod \"openstack-operator-controller-operator-58f6bc99f-xsx4k\" (UID: \"79ca6c6a-0b9f-4122-87ff-4eeb56046125\") " pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.563724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xb98\" (UniqueName: \"kubernetes.io/projected/79ca6c6a-0b9f-4122-87ff-4eeb56046125-kube-api-access-7xb98\") pod \"openstack-operator-controller-operator-58f6bc99f-xsx4k\" (UID: \"79ca6c6a-0b9f-4122-87ff-4eeb56046125\") " pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.580679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xb98\" (UniqueName: \"kubernetes.io/projected/79ca6c6a-0b9f-4122-87ff-4eeb56046125-kube-api-access-7xb98\") pod \"openstack-operator-controller-operator-58f6bc99f-xsx4k\" (UID: \"79ca6c6a-0b9f-4122-87ff-4eeb56046125\") " pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:30 crc kubenswrapper[4764]: I1001 16:15:30.755559 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:31 crc kubenswrapper[4764]: I1001 16:15:31.195641 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k"] Oct 01 16:15:31 crc kubenswrapper[4764]: W1001 16:15:31.206881 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ca6c6a_0b9f_4122_87ff_4eeb56046125.slice/crio-81745adc34f9042d9273c4cf09d303e4e249dfb86e59a324f000d9468db9b3cb WatchSource:0}: Error finding container 81745adc34f9042d9273c4cf09d303e4e249dfb86e59a324f000d9468db9b3cb: Status 404 returned error can't find the container with id 81745adc34f9042d9273c4cf09d303e4e249dfb86e59a324f000d9468db9b3cb Oct 01 16:15:32 crc kubenswrapper[4764]: I1001 16:15:32.087702 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" event={"ID":"79ca6c6a-0b9f-4122-87ff-4eeb56046125","Type":"ContainerStarted","Data":"81745adc34f9042d9273c4cf09d303e4e249dfb86e59a324f000d9468db9b3cb"} Oct 01 16:15:33 crc kubenswrapper[4764]: I1001 16:15:33.677869 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:15:33 crc kubenswrapper[4764]: I1001 16:15:33.681074 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6tr29" Oct 01 16:15:35 crc kubenswrapper[4764]: I1001 16:15:35.685943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:35 crc kubenswrapper[4764]: I1001 16:15:35.686848 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hjwww" Oct 01 
16:15:35 crc kubenswrapper[4764]: I1001 16:15:35.754429 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:36 crc kubenswrapper[4764]: I1001 16:15:36.134552 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" event={"ID":"79ca6c6a-0b9f-4122-87ff-4eeb56046125","Type":"ContainerStarted","Data":"e72a2a0aa7c8fb4f3862158d36173b5594c05fa1cac36d1bc3d9f0a026b18dee"} Oct 01 16:15:36 crc kubenswrapper[4764]: I1001 16:15:36.176745 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:38 crc kubenswrapper[4764]: I1001 16:15:38.144540 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjwww"] Oct 01 16:15:38 crc kubenswrapper[4764]: I1001 16:15:38.145166 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hjwww" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="registry-server" containerID="cri-o://d084de03de01554a630a99f4e32851a1b35028f45d25e8fbe2bc42c8c4ed3660" gracePeriod=2 Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.153594 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerID="d084de03de01554a630a99f4e32851a1b35028f45d25e8fbe2bc42c8c4ed3660" exitCode=0 Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.153672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerDied","Data":"d084de03de01554a630a99f4e32851a1b35028f45d25e8fbe2bc42c8c4ed3660"} Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.441324 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.492089 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfwqf\" (UniqueName: \"kubernetes.io/projected/4d9b2e56-8e34-44dd-9d50-7c5a66308655-kube-api-access-cfwqf\") pod \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.492169 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-catalog-content\") pod \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.492215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-utilities\") pod \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\" (UID: \"4d9b2e56-8e34-44dd-9d50-7c5a66308655\") " Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.493385 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-utilities" (OuterVolumeSpecName: "utilities") pod "4d9b2e56-8e34-44dd-9d50-7c5a66308655" (UID: "4d9b2e56-8e34-44dd-9d50-7c5a66308655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.498617 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d9b2e56-8e34-44dd-9d50-7c5a66308655-kube-api-access-cfwqf" (OuterVolumeSpecName: "kube-api-access-cfwqf") pod "4d9b2e56-8e34-44dd-9d50-7c5a66308655" (UID: "4d9b2e56-8e34-44dd-9d50-7c5a66308655"). InnerVolumeSpecName "kube-api-access-cfwqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.559525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d9b2e56-8e34-44dd-9d50-7c5a66308655" (UID: "4d9b2e56-8e34-44dd-9d50-7c5a66308655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.594321 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfwqf\" (UniqueName: \"kubernetes.io/projected/4d9b2e56-8e34-44dd-9d50-7c5a66308655-kube-api-access-cfwqf\") on node \"crc\" DevicePath \"\"" Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.594376 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:15:39 crc kubenswrapper[4764]: I1001 16:15:39.594394 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d9b2e56-8e34-44dd-9d50-7c5a66308655-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.164555 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hjwww" event={"ID":"4d9b2e56-8e34-44dd-9d50-7c5a66308655","Type":"ContainerDied","Data":"5925d3b1912f423e32b5274296b096f8b10add16e06cbfcf37519d63e9909241"} Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.164638 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hjwww" Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.164908 4764 scope.go:117] "RemoveContainer" containerID="d084de03de01554a630a99f4e32851a1b35028f45d25e8fbe2bc42c8c4ed3660" Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.197827 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hjwww"] Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.205918 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hjwww"] Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.220354 4764 scope.go:117] "RemoveContainer" containerID="0dbb36eba0f30777c112f64852d7b0d085c3bf25f85dec378b30fc09f4be14e5" Oct 01 16:15:40 crc kubenswrapper[4764]: I1001 16:15:40.245696 4764 scope.go:117] "RemoveContainer" containerID="40aef06b548b8068aeba19d3ede4b398385c784acc0a2f2829160dd8eb3eb88a" Oct 01 16:15:41 crc kubenswrapper[4764]: I1001 16:15:41.171760 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" event={"ID":"79ca6c6a-0b9f-4122-87ff-4eeb56046125","Type":"ContainerStarted","Data":"8ce3ed58f723b95ca0d786181a9e4237f1b81dcd6bb9d0b4d4ce4f08f0c4f1b9"} Oct 01 16:15:41 crc kubenswrapper[4764]: I1001 16:15:41.171975 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:41 crc kubenswrapper[4764]: I1001 16:15:41.174013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" Oct 01 16:15:41 crc kubenswrapper[4764]: I1001 16:15:41.206175 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-58f6bc99f-xsx4k" podStartSLOduration=2.27909966 
podStartE2EDuration="11.206155377s" podCreationTimestamp="2025-10-01 16:15:30 +0000 UTC" firstStartedPulling="2025-10-01 16:15:31.208972737 +0000 UTC m=+794.208619572" lastFinishedPulling="2025-10-01 16:15:40.136028434 +0000 UTC m=+803.135675289" observedRunningTime="2025-10-01 16:15:41.201725787 +0000 UTC m=+804.201372642" watchObservedRunningTime="2025-10-01 16:15:41.206155377 +0000 UTC m=+804.205802232" Oct 01 16:15:41 crc kubenswrapper[4764]: I1001 16:15:41.744154 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" path="/var/lib/kubelet/pods/4d9b2e56-8e34-44dd-9d50-7c5a66308655/volumes" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.086842 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrnc5"] Oct 01 16:15:49 crc kubenswrapper[4764]: E1001 16:15:49.089287 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="extract-utilities" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.089488 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="extract-utilities" Oct 01 16:15:49 crc kubenswrapper[4764]: E1001 16:15:49.089645 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="extract-content" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.089763 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="extract-content" Oct 01 16:15:49 crc kubenswrapper[4764]: E1001 16:15:49.089907 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="registry-server" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.090026 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" 
containerName="registry-server" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.090347 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d9b2e56-8e34-44dd-9d50-7c5a66308655" containerName="registry-server" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.091903 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.110177 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrnc5"] Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.153608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-utilities\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.153895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-catalog-content\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.154089 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqxkg\" (UniqueName: \"kubernetes.io/projected/72a53531-4a05-4107-9461-2e119660fa5b-kube-api-access-cqxkg\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.255453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqxkg\" 
(UniqueName: \"kubernetes.io/projected/72a53531-4a05-4107-9461-2e119660fa5b-kube-api-access-cqxkg\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.255630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-utilities\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.255689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-catalog-content\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.256265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-utilities\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.256357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-catalog-content\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.287189 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqxkg\" (UniqueName: 
\"kubernetes.io/projected/72a53531-4a05-4107-9461-2e119660fa5b-kube-api-access-cqxkg\") pod \"redhat-operators-qrnc5\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.463296 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:49 crc kubenswrapper[4764]: I1001 16:15:49.895543 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrnc5"] Oct 01 16:15:49 crc kubenswrapper[4764]: W1001 16:15:49.904352 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a53531_4a05_4107_9461_2e119660fa5b.slice/crio-2d74d155d1a906a6676f01166c3764090ae6a085f1b45184dfa7078177023eec WatchSource:0}: Error finding container 2d74d155d1a906a6676f01166c3764090ae6a085f1b45184dfa7078177023eec: Status 404 returned error can't find the container with id 2d74d155d1a906a6676f01166c3764090ae6a085f1b45184dfa7078177023eec Oct 01 16:15:50 crc kubenswrapper[4764]: I1001 16:15:50.226925 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnc5" event={"ID":"72a53531-4a05-4107-9461-2e119660fa5b","Type":"ContainerStarted","Data":"2d74d155d1a906a6676f01166c3764090ae6a085f1b45184dfa7078177023eec"} Oct 01 16:15:51 crc kubenswrapper[4764]: I1001 16:15:51.234919 4764 generic.go:334] "Generic (PLEG): container finished" podID="72a53531-4a05-4107-9461-2e119660fa5b" containerID="ac110c32a5f8358ef52b13f929860e986bb518cc32a2cf7d2534249ff420f858" exitCode=0 Oct 01 16:15:51 crc kubenswrapper[4764]: I1001 16:15:51.235028 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnc5" 
event={"ID":"72a53531-4a05-4107-9461-2e119660fa5b","Type":"ContainerDied","Data":"ac110c32a5f8358ef52b13f929860e986bb518cc32a2cf7d2534249ff420f858"} Oct 01 16:15:51 crc kubenswrapper[4764]: I1001 16:15:51.913957 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:15:51 crc kubenswrapper[4764]: I1001 16:15:51.914036 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:15:53 crc kubenswrapper[4764]: I1001 16:15:53.251592 4764 generic.go:334] "Generic (PLEG): container finished" podID="72a53531-4a05-4107-9461-2e119660fa5b" containerID="ddc87fbeeebb588913746d1191cd3f68cd1093a6d52f8545afcdcd5de29f821d" exitCode=0 Oct 01 16:15:53 crc kubenswrapper[4764]: I1001 16:15:53.251690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnc5" event={"ID":"72a53531-4a05-4107-9461-2e119660fa5b","Type":"ContainerDied","Data":"ddc87fbeeebb588913746d1191cd3f68cd1093a6d52f8545afcdcd5de29f821d"} Oct 01 16:15:55 crc kubenswrapper[4764]: I1001 16:15:55.272959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnc5" event={"ID":"72a53531-4a05-4107-9461-2e119660fa5b","Type":"ContainerStarted","Data":"dc407417ab952e28a781653820d6dc11112b27f7a64e55a82d2838f41d33ef10"} Oct 01 16:15:55 crc kubenswrapper[4764]: I1001 16:15:55.299190 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrnc5" 
podStartSLOduration=3.260597462 podStartE2EDuration="6.299157947s" podCreationTimestamp="2025-10-01 16:15:49 +0000 UTC" firstStartedPulling="2025-10-01 16:15:51.237094963 +0000 UTC m=+814.236741818" lastFinishedPulling="2025-10-01 16:15:54.275655458 +0000 UTC m=+817.275302303" observedRunningTime="2025-10-01 16:15:55.290624435 +0000 UTC m=+818.290271320" watchObservedRunningTime="2025-10-01 16:15:55.299157947 +0000 UTC m=+818.298804832" Oct 01 16:15:59 crc kubenswrapper[4764]: I1001 16:15:59.464372 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:15:59 crc kubenswrapper[4764]: I1001 16:15:59.465145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:16:00 crc kubenswrapper[4764]: I1001 16:16:00.521810 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrnc5" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="registry-server" probeResult="failure" output=< Oct 01 16:16:00 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Oct 01 16:16:00 crc kubenswrapper[4764]: > Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.254692 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqg9"] Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.257323 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.263975 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqg9"] Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.358760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-utilities\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.359027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcnm4\" (UniqueName: \"kubernetes.io/projected/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-kube-api-access-dcnm4\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.359443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-catalog-content\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.461158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-utilities\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.461231 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dcnm4\" (UniqueName: \"kubernetes.io/projected/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-kube-api-access-dcnm4\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.461280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-catalog-content\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.461669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-utilities\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.461718 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-catalog-content\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.486992 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcnm4\" (UniqueName: \"kubernetes.io/projected/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-kube-api-access-dcnm4\") pod \"redhat-marketplace-qnqg9\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.583931 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:03 crc kubenswrapper[4764]: I1001 16:16:03.993108 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqg9"] Oct 01 16:16:04 crc kubenswrapper[4764]: I1001 16:16:04.338368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqg9" event={"ID":"9ef9e753-0c90-4529-ab3d-8bb61a44efc7","Type":"ContainerStarted","Data":"b971bcc29496989abe2c000f67e11cca788fa85580d5485832e0fcc577fee425"} Oct 01 16:16:05 crc kubenswrapper[4764]: I1001 16:16:05.351542 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerID="0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea" exitCode=0 Oct 01 16:16:05 crc kubenswrapper[4764]: I1001 16:16:05.352112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqg9" event={"ID":"9ef9e753-0c90-4529-ab3d-8bb61a44efc7","Type":"ContainerDied","Data":"0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea"} Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.363936 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerID="b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690" exitCode=0 Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.364008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqg9" event={"ID":"9ef9e753-0c90-4529-ab3d-8bb61a44efc7","Type":"ContainerDied","Data":"b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690"} Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.833969 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k6866"] Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.835115 4764 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.855921 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6866"] Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.925207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1709385e-fe6c-443e-a437-ceded08bde5b-catalog-content\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.925399 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl54v\" (UniqueName: \"kubernetes.io/projected/1709385e-fe6c-443e-a437-ceded08bde5b-kube-api-access-xl54v\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:07 crc kubenswrapper[4764]: I1001 16:16:07.925495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1709385e-fe6c-443e-a437-ceded08bde5b-utilities\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.026554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1709385e-fe6c-443e-a437-ceded08bde5b-utilities\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.026644 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1709385e-fe6c-443e-a437-ceded08bde5b-catalog-content\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.026696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl54v\" (UniqueName: \"kubernetes.io/projected/1709385e-fe6c-443e-a437-ceded08bde5b-kube-api-access-xl54v\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.027251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1709385e-fe6c-443e-a437-ceded08bde5b-utilities\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.029748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1709385e-fe6c-443e-a437-ceded08bde5b-catalog-content\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.054313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl54v\" (UniqueName: \"kubernetes.io/projected/1709385e-fe6c-443e-a437-ceded08bde5b-kube-api-access-xl54v\") pod \"certified-operators-k6866\" (UID: \"1709385e-fe6c-443e-a437-ceded08bde5b\") " pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.150796 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.397480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqg9" event={"ID":"9ef9e753-0c90-4529-ab3d-8bb61a44efc7","Type":"ContainerStarted","Data":"6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33"} Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.450036 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qnqg9" podStartSLOduration=2.685967293 podStartE2EDuration="5.450020293s" podCreationTimestamp="2025-10-01 16:16:03 +0000 UTC" firstStartedPulling="2025-10-01 16:16:05.355765008 +0000 UTC m=+828.355411853" lastFinishedPulling="2025-10-01 16:16:08.119818018 +0000 UTC m=+831.119464853" observedRunningTime="2025-10-01 16:16:08.43054635 +0000 UTC m=+831.430193185" watchObservedRunningTime="2025-10-01 16:16:08.450020293 +0000 UTC m=+831.449667128" Oct 01 16:16:08 crc kubenswrapper[4764]: I1001 16:16:08.450308 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6866"] Oct 01 16:16:08 crc kubenswrapper[4764]: W1001 16:16:08.457965 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1709385e_fe6c_443e_a437_ceded08bde5b.slice/crio-186d74eb53ab3a5fd6c2019bd27bc2dc2db4ce00bd68b39186be7c9768a6c9f4 WatchSource:0}: Error finding container 186d74eb53ab3a5fd6c2019bd27bc2dc2db4ce00bd68b39186be7c9768a6c9f4: Status 404 returned error can't find the container with id 186d74eb53ab3a5fd6c2019bd27bc2dc2db4ce00bd68b39186be7c9768a6c9f4 Oct 01 16:16:09 crc kubenswrapper[4764]: I1001 16:16:09.403980 4764 generic.go:334] "Generic (PLEG): container finished" podID="1709385e-fe6c-443e-a437-ceded08bde5b" containerID="6c34aa1de934ac638afdbdd3386b8d1aa55aec31a8bc4ea830f5f1922444b16f" exitCode=0 
Oct 01 16:16:09 crc kubenswrapper[4764]: I1001 16:16:09.404144 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6866" event={"ID":"1709385e-fe6c-443e-a437-ceded08bde5b","Type":"ContainerDied","Data":"6c34aa1de934ac638afdbdd3386b8d1aa55aec31a8bc4ea830f5f1922444b16f"} Oct 01 16:16:09 crc kubenswrapper[4764]: I1001 16:16:09.405361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6866" event={"ID":"1709385e-fe6c-443e-a437-ceded08bde5b","Type":"ContainerStarted","Data":"186d74eb53ab3a5fd6c2019bd27bc2dc2db4ce00bd68b39186be7c9768a6c9f4"} Oct 01 16:16:09 crc kubenswrapper[4764]: I1001 16:16:09.517614 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:16:09 crc kubenswrapper[4764]: I1001 16:16:09.568432 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:16:12 crc kubenswrapper[4764]: I1001 16:16:12.231372 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrnc5"] Oct 01 16:16:12 crc kubenswrapper[4764]: I1001 16:16:12.232016 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrnc5" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="registry-server" containerID="cri-o://dc407417ab952e28a781653820d6dc11112b27f7a64e55a82d2838f41d33ef10" gracePeriod=2 Oct 01 16:16:13 crc kubenswrapper[4764]: I1001 16:16:13.452710 4764 generic.go:334] "Generic (PLEG): container finished" podID="72a53531-4a05-4107-9461-2e119660fa5b" containerID="dc407417ab952e28a781653820d6dc11112b27f7a64e55a82d2838f41d33ef10" exitCode=0 Oct 01 16:16:13 crc kubenswrapper[4764]: I1001 16:16:13.452753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnc5" 
event={"ID":"72a53531-4a05-4107-9461-2e119660fa5b","Type":"ContainerDied","Data":"dc407417ab952e28a781653820d6dc11112b27f7a64e55a82d2838f41d33ef10"} Oct 01 16:16:13 crc kubenswrapper[4764]: I1001 16:16:13.584410 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:13 crc kubenswrapper[4764]: I1001 16:16:13.585286 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:13 crc kubenswrapper[4764]: I1001 16:16:13.647556 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:14 crc kubenswrapper[4764]: I1001 16:16:14.515101 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.231913 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqg9"] Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.552930 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.638255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-utilities\") pod \"72a53531-4a05-4107-9461-2e119660fa5b\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.638381 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-catalog-content\") pod \"72a53531-4a05-4107-9461-2e119660fa5b\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.638416 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqxkg\" (UniqueName: \"kubernetes.io/projected/72a53531-4a05-4107-9461-2e119660fa5b-kube-api-access-cqxkg\") pod \"72a53531-4a05-4107-9461-2e119660fa5b\" (UID: \"72a53531-4a05-4107-9461-2e119660fa5b\") " Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.640735 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-utilities" (OuterVolumeSpecName: "utilities") pod "72a53531-4a05-4107-9461-2e119660fa5b" (UID: "72a53531-4a05-4107-9461-2e119660fa5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.648972 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a53531-4a05-4107-9461-2e119660fa5b-kube-api-access-cqxkg" (OuterVolumeSpecName: "kube-api-access-cqxkg") pod "72a53531-4a05-4107-9461-2e119660fa5b" (UID: "72a53531-4a05-4107-9461-2e119660fa5b"). InnerVolumeSpecName "kube-api-access-cqxkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.739483 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.739798 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqxkg\" (UniqueName: \"kubernetes.io/projected/72a53531-4a05-4107-9461-2e119660fa5b-kube-api-access-cqxkg\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.777976 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72a53531-4a05-4107-9461-2e119660fa5b" (UID: "72a53531-4a05-4107-9461-2e119660fa5b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:16:15 crc kubenswrapper[4764]: I1001 16:16:15.840768 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a53531-4a05-4107-9461-2e119660fa5b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.474083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrnc5" event={"ID":"72a53531-4a05-4107-9461-2e119660fa5b","Type":"ContainerDied","Data":"2d74d155d1a906a6676f01166c3764090ae6a085f1b45184dfa7078177023eec"} Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.474458 4764 scope.go:117] "RemoveContainer" containerID="dc407417ab952e28a781653820d6dc11112b27f7a64e55a82d2838f41d33ef10" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.474143 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrnc5" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.480795 4764 generic.go:334] "Generic (PLEG): container finished" podID="1709385e-fe6c-443e-a437-ceded08bde5b" containerID="ce1e5ceabe10bbfc4ab40d80561af9a0d0a192c61769eb3be74fc61e2b195452" exitCode=0 Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.480956 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qnqg9" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="registry-server" containerID="cri-o://6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33" gracePeriod=2 Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.481655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6866" event={"ID":"1709385e-fe6c-443e-a437-ceded08bde5b","Type":"ContainerDied","Data":"ce1e5ceabe10bbfc4ab40d80561af9a0d0a192c61769eb3be74fc61e2b195452"} Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.502736 4764 scope.go:117] "RemoveContainer" containerID="ddc87fbeeebb588913746d1191cd3f68cd1093a6d52f8545afcdcd5de29f821d" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.539771 4764 scope.go:117] "RemoveContainer" containerID="ac110c32a5f8358ef52b13f929860e986bb518cc32a2cf7d2534249ff420f858" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.544521 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrnc5"] Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.552626 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrnc5"] Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.879457 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.958968 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-catalog-content\") pod \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.959109 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-utilities\") pod \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.959187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcnm4\" (UniqueName: \"kubernetes.io/projected/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-kube-api-access-dcnm4\") pod \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\" (UID: \"9ef9e753-0c90-4529-ab3d-8bb61a44efc7\") " Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.959953 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-utilities" (OuterVolumeSpecName: "utilities") pod "9ef9e753-0c90-4529-ab3d-8bb61a44efc7" (UID: "9ef9e753-0c90-4529-ab3d-8bb61a44efc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.963490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-kube-api-access-dcnm4" (OuterVolumeSpecName: "kube-api-access-dcnm4") pod "9ef9e753-0c90-4529-ab3d-8bb61a44efc7" (UID: "9ef9e753-0c90-4529-ab3d-8bb61a44efc7"). InnerVolumeSpecName "kube-api-access-dcnm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:16:16 crc kubenswrapper[4764]: I1001 16:16:16.973720 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ef9e753-0c90-4529-ab3d-8bb61a44efc7" (UID: "9ef9e753-0c90-4529-ab3d-8bb61a44efc7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.061239 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.061290 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcnm4\" (UniqueName: \"kubernetes.io/projected/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-kube-api-access-dcnm4\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.061304 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef9e753-0c90-4529-ab3d-8bb61a44efc7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.489025 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k6866" event={"ID":"1709385e-fe6c-443e-a437-ceded08bde5b","Type":"ContainerStarted","Data":"fab94c6c34af3e06501a5fb6a4673e85d6fcb5f5eef22547ad88a11b2ba06b39"} Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.491729 4764 generic.go:334] "Generic (PLEG): container finished" podID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerID="6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33" exitCode=0 Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.491756 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-qnqg9" event={"ID":"9ef9e753-0c90-4529-ab3d-8bb61a44efc7","Type":"ContainerDied","Data":"6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33"} Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.491773 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qnqg9" event={"ID":"9ef9e753-0c90-4529-ab3d-8bb61a44efc7","Type":"ContainerDied","Data":"b971bcc29496989abe2c000f67e11cca788fa85580d5485832e0fcc577fee425"} Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.491797 4764 scope.go:117] "RemoveContainer" containerID="6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.491862 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qnqg9" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.508187 4764 scope.go:117] "RemoveContainer" containerID="b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.515835 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k6866" podStartSLOduration=2.698462501 podStartE2EDuration="10.515811204s" podCreationTimestamp="2025-10-01 16:16:07 +0000 UTC" firstStartedPulling="2025-10-01 16:16:09.405905748 +0000 UTC m=+832.405552593" lastFinishedPulling="2025-10-01 16:16:17.223254441 +0000 UTC m=+840.222901296" observedRunningTime="2025-10-01 16:16:17.515523667 +0000 UTC m=+840.515170502" watchObservedRunningTime="2025-10-01 16:16:17.515811204 +0000 UTC m=+840.515458049" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.532585 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qnqg9"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.536888 4764 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-qnqg9"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.538228 4764 scope.go:117] "RemoveContainer" containerID="0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.575264 4764 scope.go:117] "RemoveContainer" containerID="6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.575760 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33\": container with ID starting with 6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33 not found: ID does not exist" containerID="6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.575797 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33"} err="failed to get container status \"6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33\": rpc error: code = NotFound desc = could not find container \"6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33\": container with ID starting with 6bb4aa707b3657e0a32351d88f5994c53d3713990b925d64cb524f0ac0f31b33 not found: ID does not exist" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.575827 4764 scope.go:117] "RemoveContainer" containerID="b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.576029 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690\": container with ID starting with 
b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690 not found: ID does not exist" containerID="b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.576087 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690"} err="failed to get container status \"b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690\": rpc error: code = NotFound desc = could not find container \"b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690\": container with ID starting with b4e1ef4b1ffc7b74f512bce045ad80fc8a31817b604e91cd24039dcad98a4690 not found: ID does not exist" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.576107 4764 scope.go:117] "RemoveContainer" containerID="0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.576415 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea\": container with ID starting with 0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea not found: ID does not exist" containerID="0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.576443 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea"} err="failed to get container status \"0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea\": rpc error: code = NotFound desc = could not find container \"0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea\": container with ID starting with 0d30c0bf70ec6db8addc82cde59d56140362694c8f50b6e3608a2eb877500aea not found: ID does not 
exist" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.644707 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b"] Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.644936 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="extract-content" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.644948 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="extract-content" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.644955 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="extract-utilities" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.644962 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="extract-utilities" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.644973 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="registry-server" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.644979 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="registry-server" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.644991 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="registry-server" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.644997 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="registry-server" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.645009 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a53531-4a05-4107-9461-2e119660fa5b" 
containerName="extract-utilities" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.645015 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="extract-utilities" Oct 01 16:16:17 crc kubenswrapper[4764]: E1001 16:16:17.645022 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="extract-content" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.645027 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="extract-content" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.645154 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" containerName="registry-server" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.645167 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a53531-4a05-4107-9461-2e119660fa5b" containerName="registry-server" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.645705 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.647358 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vqm4t" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.652581 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.653608 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.655032 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qdgfn" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.667756 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.672239 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.687856 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.688761 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.690606 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-csglr" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.698255 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.700323 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.703543 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-n9dbd" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.719975 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.735520 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a53531-4a05-4107-9461-2e119660fa5b" path="/var/lib/kubelet/pods/72a53531-4a05-4107-9461-2e119660fa5b/volumes" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.736171 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef9e753-0c90-4529-ab3d-8bb61a44efc7" path="/var/lib/kubelet/pods/9ef9e753-0c90-4529-ab3d-8bb61a44efc7/volumes" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.736713 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.770201 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.775499 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.806123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnwl\" (UniqueName: \"kubernetes.io/projected/af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd-kube-api-access-9mnwl\") pod \"barbican-operator-controller-manager-6ff8b75857-4724b\" (UID: \"af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.806212 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvt9r\" (UniqueName: \"kubernetes.io/projected/74ebea6a-ca87-4ccb-ab25-3c4899c04d39-kube-api-access-kvt9r\") pod \"cinder-operator-controller-manager-644bddb6d8-lrzn4\" (UID: \"74ebea6a-ca87-4ccb-ab25-3c4899c04d39\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.806249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7km5\" (UniqueName: \"kubernetes.io/projected/0fff9cd4-9690-4a70-a578-0eadbcbb47d6-kube-api-access-r7km5\") pod \"glance-operator-controller-manager-84958c4d49-vnmvt\" (UID: \"0fff9cd4-9690-4a70-a578-0eadbcbb47d6\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.806285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjxvb\" (UniqueName: \"kubernetes.io/projected/0fe3c02d-9e92-4628-8d94-6797d56fe480-kube-api-access-rjxvb\") pod \"designate-operator-controller-manager-84f4f7b77b-27n67\" (UID: \"0fe3c02d-9e92-4628-8d94-6797d56fe480\") " 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.809027 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.811356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.813525 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.822438 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2vcsc" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.822669 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-gjwqg" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.844151 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.845332 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.849285 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.849440 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-lltrs" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.850809 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.851063 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.853984 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fflcb" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.870251 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.883814 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.888368 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.903953 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.905093 4764 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.910178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnwl\" (UniqueName: \"kubernetes.io/projected/af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd-kube-api-access-9mnwl\") pod \"barbican-operator-controller-manager-6ff8b75857-4724b\" (UID: \"af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.910222 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgqmc\" (UniqueName: \"kubernetes.io/projected/323e4260-1016-4601-a8c1-f75641230fdb-kube-api-access-qgqmc\") pod \"horizon-operator-controller-manager-9f4696d94-g528r\" (UID: \"323e4260-1016-4601-a8c1-f75641230fdb\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.910260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvt9r\" (UniqueName: \"kubernetes.io/projected/74ebea6a-ca87-4ccb-ab25-3c4899c04d39-kube-api-access-kvt9r\") pod \"cinder-operator-controller-manager-644bddb6d8-lrzn4\" (UID: \"74ebea6a-ca87-4ccb-ab25-3c4899c04d39\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.910284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7km5\" (UniqueName: \"kubernetes.io/projected/0fff9cd4-9690-4a70-a578-0eadbcbb47d6-kube-api-access-r7km5\") pod \"glance-operator-controller-manager-84958c4d49-vnmvt\" (UID: \"0fff9cd4-9690-4a70-a578-0eadbcbb47d6\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:17 crc 
kubenswrapper[4764]: I1001 16:16:17.910310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czfs4\" (UniqueName: \"kubernetes.io/projected/9973c37b-d58f-48b0-8c1e-707576e2cb09-kube-api-access-czfs4\") pod \"heat-operator-controller-manager-5d889d78cf-9vv9s\" (UID: \"9973c37b-d58f-48b0-8c1e-707576e2cb09\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.910339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjxvb\" (UniqueName: \"kubernetes.io/projected/0fe3c02d-9e92-4628-8d94-6797d56fe480-kube-api-access-rjxvb\") pod \"designate-operator-controller-manager-84f4f7b77b-27n67\" (UID: \"0fe3c02d-9e92-4628-8d94-6797d56fe480\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.911445 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-n2mgw" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.914128 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.915487 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.923541 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-4m67m" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.944292 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.948793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjxvb\" (UniqueName: \"kubernetes.io/projected/0fe3c02d-9e92-4628-8d94-6797d56fe480-kube-api-access-rjxvb\") pod \"designate-operator-controller-manager-84f4f7b77b-27n67\" (UID: \"0fe3c02d-9e92-4628-8d94-6797d56fe480\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.949923 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.955533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnwl\" (UniqueName: \"kubernetes.io/projected/af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd-kube-api-access-9mnwl\") pod \"barbican-operator-controller-manager-6ff8b75857-4724b\" (UID: \"af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.968292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vqm4t" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.970320 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-88c7-lms99"] Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.971347 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.975632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvt9r\" (UniqueName: \"kubernetes.io/projected/74ebea6a-ca87-4ccb-ab25-3c4899c04d39-kube-api-access-kvt9r\") pod \"cinder-operator-controller-manager-644bddb6d8-lrzn4\" (UID: \"74ebea6a-ca87-4ccb-ab25-3c4899c04d39\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.975709 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.976660 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7km5\" (UniqueName: \"kubernetes.io/projected/0fff9cd4-9690-4a70-a578-0eadbcbb47d6-kube-api-access-r7km5\") pod \"glance-operator-controller-manager-84958c4d49-vnmvt\" (UID: \"0fff9cd4-9690-4a70-a578-0eadbcbb47d6\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:17 crc kubenswrapper[4764]: I1001 16:16:17.977234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zp5zn" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.008362 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.013718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bldr\" (UniqueName: \"kubernetes.io/projected/c535869c-c448-4bea-944d-fce55ddd334c-kube-api-access-5bldr\") pod \"ironic-operator-controller-manager-5cd4858477-62xn4\" (UID: \"c535869c-c448-4bea-944d-fce55ddd334c\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.013760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czfs4\" (UniqueName: \"kubernetes.io/projected/9973c37b-d58f-48b0-8c1e-707576e2cb09-kube-api-access-czfs4\") pod \"heat-operator-controller-manager-5d889d78cf-9vv9s\" (UID: \"9973c37b-d58f-48b0-8c1e-707576e2cb09\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.013781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5x6p\" (UniqueName: \"kubernetes.io/projected/67c0305d-d391-4892-869d-f5702a69cc45-kube-api-access-w5x6p\") pod \"keystone-operator-controller-manager-5bd55b4bff-fg9hc\" (UID: \"67c0305d-d391-4892-869d-f5702a69cc45\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.013826 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:18 crc kubenswrapper[4764]: 
I1001 16:16:18.013846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l2gn\" (UniqueName: \"kubernetes.io/projected/8c17b6bf-0c13-491a-977a-95566d56d7c4-kube-api-access-2l2gn\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.013869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjtqt\" (UniqueName: \"kubernetes.io/projected/db117cfb-3e46-4428-93b1-44a66101c57d-kube-api-access-cjtqt\") pod \"manila-operator-controller-manager-5b67755477-8xdpz\" (UID: \"db117cfb-3e46-4428-93b1-44a66101c57d\") " pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.013898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgqmc\" (UniqueName: \"kubernetes.io/projected/323e4260-1016-4601-a8c1-f75641230fdb-kube-api-access-qgqmc\") pod \"horizon-operator-controller-manager-9f4696d94-g528r\" (UID: \"323e4260-1016-4601-a8c1-f75641230fdb\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.014526 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.015462 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.023768 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sh8tw" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.024097 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-lms99"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.035720 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.042165 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.059124 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.060344 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.062433 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qfmwn" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.065667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czfs4\" (UniqueName: \"kubernetes.io/projected/9973c37b-d58f-48b0-8c1e-707576e2cb09-kube-api-access-czfs4\") pod \"heat-operator-controller-manager-5d889d78cf-9vv9s\" (UID: \"9973c37b-d58f-48b0-8c1e-707576e2cb09\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.070942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgqmc\" (UniqueName: \"kubernetes.io/projected/323e4260-1016-4601-a8c1-f75641230fdb-kube-api-access-qgqmc\") pod \"horizon-operator-controller-manager-9f4696d94-g528r\" (UID: \"323e4260-1016-4601-a8c1-f75641230fdb\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.092860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.108033 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.109146 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.112390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.116561 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.116593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l2gn\" (UniqueName: \"kubernetes.io/projected/8c17b6bf-0c13-491a-977a-95566d56d7c4-kube-api-access-2l2gn\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.116716 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjtqt\" (UniqueName: \"kubernetes.io/projected/db117cfb-3e46-4428-93b1-44a66101c57d-kube-api-access-cjtqt\") pod \"manila-operator-controller-manager-5b67755477-8xdpz\" (UID: \"db117cfb-3e46-4428-93b1-44a66101c57d\") " pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.116852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hfc\" (UniqueName: \"kubernetes.io/projected/e4b31c01-ec06-434e-af2a-228a1ee7ec19-kube-api-access-z6hfc\") pod \"mariadb-operator-controller-manager-88c7-lms99\" (UID: 
\"e4b31c01-ec06-434e-af2a-228a1ee7ec19\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.116880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlhk4\" (UniqueName: \"kubernetes.io/projected/53b1bb68-341f-4635-8339-ff10c9b08dee-kube-api-access-qlhk4\") pod \"neutron-operator-controller-manager-849d5b9b84-z9988\" (UID: \"53b1bb68-341f-4635-8339-ff10c9b08dee\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.117010 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bldr\" (UniqueName: \"kubernetes.io/projected/c535869c-c448-4bea-944d-fce55ddd334c-kube-api-access-5bldr\") pod \"ironic-operator-controller-manager-5cd4858477-62xn4\" (UID: \"c535869c-c448-4bea-944d-fce55ddd334c\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.117032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5x6p\" (UniqueName: \"kubernetes.io/projected/67c0305d-d391-4892-869d-f5702a69cc45-kube-api-access-w5x6p\") pod \"keystone-operator-controller-manager-5bd55b4bff-fg9hc\" (UID: \"67c0305d-d391-4892-869d-f5702a69cc45\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.118038 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.118097 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert podName:8c17b6bf-0c13-491a-977a-95566d56d7c4 nodeName:}" failed. 
No retries permitted until 2025-10-01 16:16:18.618082785 +0000 UTC m=+841.617729620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert") pod "infra-operator-controller-manager-9d6c5db85-bwj7p" (UID: "8c17b6bf-0c13-491a-977a-95566d56d7c4") : secret "infra-operator-webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.131040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.141605 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-lj7h2" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.148712 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.152827 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.152883 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.156592 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.171983 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.174221 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-m8csc" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.182150 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.228677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5mqq\" (UniqueName: \"kubernetes.io/projected/394ffa3b-3cd6-4deb-a436-624fa75155a2-kube-api-access-s5mqq\") pod \"nova-operator-controller-manager-64cd67b5cb-59r2r\" (UID: \"394ffa3b-3cd6-4deb-a436-624fa75155a2\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.228723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbj7t\" (UniqueName: \"kubernetes.io/projected/2050b8cd-91c1-4962-b346-bbfa5c4e652e-kube-api-access-kbj7t\") pod \"octavia-operator-controller-manager-7b787867f4-vnlvl\" (UID: \"2050b8cd-91c1-4962-b346-bbfa5c4e652e\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.228816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hfc\" (UniqueName: \"kubernetes.io/projected/e4b31c01-ec06-434e-af2a-228a1ee7ec19-kube-api-access-z6hfc\") pod \"mariadb-operator-controller-manager-88c7-lms99\" (UID: \"e4b31c01-ec06-434e-af2a-228a1ee7ec19\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.228843 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlhk4\" (UniqueName: \"kubernetes.io/projected/53b1bb68-341f-4635-8339-ff10c9b08dee-kube-api-access-qlhk4\") pod \"neutron-operator-controller-manager-849d5b9b84-z9988\" (UID: \"53b1bb68-341f-4635-8339-ff10c9b08dee\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.234908 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.237371 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.237763 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7c7wq" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.266863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l2gn\" (UniqueName: \"kubernetes.io/projected/8c17b6bf-0c13-491a-977a-95566d56d7c4-kube-api-access-2l2gn\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.283938 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qdgfn" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.293598 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.301480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjtqt\" (UniqueName: \"kubernetes.io/projected/db117cfb-3e46-4428-93b1-44a66101c57d-kube-api-access-cjtqt\") pod \"manila-operator-controller-manager-5b67755477-8xdpz\" (UID: \"db117cfb-3e46-4428-93b1-44a66101c57d\") " pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.317780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.325587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bldr\" (UniqueName: \"kubernetes.io/projected/c535869c-c448-4bea-944d-fce55ddd334c-kube-api-access-5bldr\") pod \"ironic-operator-controller-manager-5cd4858477-62xn4\" (UID: \"c535869c-c448-4bea-944d-fce55ddd334c\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.327550 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.329822 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.332222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5mqq\" (UniqueName: \"kubernetes.io/projected/394ffa3b-3cd6-4deb-a436-624fa75155a2-kube-api-access-s5mqq\") pod \"nova-operator-controller-manager-64cd67b5cb-59r2r\" (UID: \"394ffa3b-3cd6-4deb-a436-624fa75155a2\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.332273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbj7t\" (UniqueName: \"kubernetes.io/projected/2050b8cd-91c1-4962-b346-bbfa5c4e652e-kube-api-access-kbj7t\") pod \"octavia-operator-controller-manager-7b787867f4-vnlvl\" (UID: \"2050b8cd-91c1-4962-b346-bbfa5c4e652e\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.332338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.332391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7wz\" (UniqueName: \"kubernetes.io/projected/ba9b6db9-115e-4760-aef3-107976da810e-kube-api-access-cb7wz\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 
16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.332443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4rx\" (UniqueName: \"kubernetes.io/projected/33cb2692-6fcf-4af5-bf43-697a4a740c19-kube-api-access-7b4rx\") pod \"ovn-operator-controller-manager-9976ff44c-rkr2b\" (UID: \"33cb2692-6fcf-4af5-bf43-697a4a740c19\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.340568 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-4xdtj" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.342176 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.342619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlhk4\" (UniqueName: \"kubernetes.io/projected/53b1bb68-341f-4635-8339-ff10c9b08dee-kube-api-access-qlhk4\") pod \"neutron-operator-controller-manager-849d5b9b84-z9988\" (UID: \"53b1bb68-341f-4635-8339-ff10c9b08dee\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.350015 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5x6p\" (UniqueName: \"kubernetes.io/projected/67c0305d-d391-4892-869d-f5702a69cc45-kube-api-access-w5x6p\") pod \"keystone-operator-controller-manager-5bd55b4bff-fg9hc\" (UID: \"67c0305d-d391-4892-869d-f5702a69cc45\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.351099 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb"] Oct 01 16:16:18 
crc kubenswrapper[4764]: I1001 16:16:18.352064 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hfc\" (UniqueName: \"kubernetes.io/projected/e4b31c01-ec06-434e-af2a-228a1ee7ec19-kube-api-access-z6hfc\") pod \"mariadb-operator-controller-manager-88c7-lms99\" (UID: \"e4b31c01-ec06-434e-af2a-228a1ee7ec19\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.354032 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.356978 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.358112 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.363558 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fs4bn" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.364476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-ws2d8" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.365277 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.367785 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5mqq\" (UniqueName: \"kubernetes.io/projected/394ffa3b-3cd6-4deb-a436-624fa75155a2-kube-api-access-s5mqq\") pod \"nova-operator-controller-manager-64cd67b5cb-59r2r\" (UID: 
\"394ffa3b-3cd6-4deb-a436-624fa75155a2\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.373277 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.375023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.383124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbj7t\" (UniqueName: \"kubernetes.io/projected/2050b8cd-91c1-4962-b346-bbfa5c4e652e-kube-api-access-kbj7t\") pod \"octavia-operator-controller-manager-7b787867f4-vnlvl\" (UID: \"2050b8cd-91c1-4962-b346-bbfa5c4e652e\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.396560 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-q6nwl"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.400108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.401406 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lvcvd" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.402882 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.409390 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.412812 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-q6nwl"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.416594 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.428104 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433076 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tbldm" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md87n\" (UniqueName: \"kubernetes.io/projected/705820be-248f-49fb-9ace-0f333674985a-kube-api-access-md87n\") pod \"swift-operator-controller-manager-84d6b4b759-vbgxb\" (UID: \"705820be-248f-49fb-9ace-0f333674985a\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4rx\" (UniqueName: \"kubernetes.io/projected/33cb2692-6fcf-4af5-bf43-697a4a740c19-kube-api-access-7b4rx\") pod \"ovn-operator-controller-manager-9976ff44c-rkr2b\" (UID: \"33cb2692-6fcf-4af5-bf43-697a4a740c19\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-trnrn\" (UniqueName: \"kubernetes.io/projected/10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b-kube-api-access-trnrn\") pod \"telemetry-operator-controller-manager-b8d54b5d7-8wqsk\" (UID: \"10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433747 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj92k\" (UniqueName: \"kubernetes.io/projected/d591342f-60e7-48db-9073-b2d6e9fe6992-kube-api-access-hj92k\") pod \"placement-operator-controller-manager-589c58c6c-jlwbd\" (UID: \"d591342f-60e7-48db-9073-b2d6e9fe6992\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.433786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7wz\" (UniqueName: \"kubernetes.io/projected/ba9b6db9-115e-4760-aef3-107976da810e-kube-api-access-cb7wz\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.433965 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 16:16:18 
crc kubenswrapper[4764]: E1001 16:16:18.434036 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert podName:ba9b6db9-115e-4760-aef3-107976da810e nodeName:}" failed. No retries permitted until 2025-10-01 16:16:18.934019627 +0000 UTC m=+841.933666462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" (UID: "ba9b6db9-115e-4760-aef3-107976da810e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.435505 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.478640 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.481208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7wz\" (UniqueName: \"kubernetes.io/projected/ba9b6db9-115e-4760-aef3-107976da810e-kube-api-access-cb7wz\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.486841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4rx\" (UniqueName: \"kubernetes.io/projected/33cb2692-6fcf-4af5-bf43-697a4a740c19-kube-api-access-7b4rx\") pod \"ovn-operator-controller-manager-9976ff44c-rkr2b\" (UID: \"33cb2692-6fcf-4af5-bf43-697a4a740c19\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" 
Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.502753 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.537691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj92k\" (UniqueName: \"kubernetes.io/projected/d591342f-60e7-48db-9073-b2d6e9fe6992-kube-api-access-hj92k\") pod \"placement-operator-controller-manager-589c58c6c-jlwbd\" (UID: \"d591342f-60e7-48db-9073-b2d6e9fe6992\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.537771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md87n\" (UniqueName: \"kubernetes.io/projected/705820be-248f-49fb-9ace-0f333674985a-kube-api-access-md87n\") pod \"swift-operator-controller-manager-84d6b4b759-vbgxb\" (UID: \"705820be-248f-49fb-9ace-0f333674985a\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.537800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ssmp\" (UniqueName: \"kubernetes.io/projected/29eab2f9-7ef6-4cc6-9f45-af32a4071a5d-kube-api-access-9ssmp\") pod \"watcher-operator-controller-manager-6b9957f54f-f5nmv\" (UID: \"29eab2f9-7ef6-4cc6-9f45-af32a4071a5d\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.537848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kzdd\" (UniqueName: \"kubernetes.io/projected/e55f7c89-8011-437e-bcbc-b19ae9e25acd-kube-api-access-8kzdd\") pod \"test-operator-controller-manager-85777745bb-q6nwl\" (UID: 
\"e55f7c89-8011-437e-bcbc-b19ae9e25acd\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.537868 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trnrn\" (UniqueName: \"kubernetes.io/projected/10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b-kube-api-access-trnrn\") pod \"telemetry-operator-controller-manager-b8d54b5d7-8wqsk\" (UID: \"10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.538461 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.539094 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.549831 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.551172 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.559554 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.564688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lqmlt" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.564622 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.568398 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.570321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md87n\" (UniqueName: \"kubernetes.io/projected/705820be-248f-49fb-9ace-0f333674985a-kube-api-access-md87n\") pod \"swift-operator-controller-manager-84d6b4b759-vbgxb\" (UID: \"705820be-248f-49fb-9ace-0f333674985a\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.580289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trnrn\" (UniqueName: \"kubernetes.io/projected/10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b-kube-api-access-trnrn\") pod \"telemetry-operator-controller-manager-b8d54b5d7-8wqsk\" (UID: \"10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.588397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj92k\" (UniqueName: 
\"kubernetes.io/projected/d591342f-60e7-48db-9073-b2d6e9fe6992-kube-api-access-hj92k\") pod \"placement-operator-controller-manager-589c58c6c-jlwbd\" (UID: \"d591342f-60e7-48db-9073-b2d6e9fe6992\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.614623 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.615865 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.618196 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-klqvk" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.622728 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.639558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0775c44c-131f-4a9c-89d5-bd724765e310-cert\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.639814 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ssmp\" (UniqueName: \"kubernetes.io/projected/29eab2f9-7ef6-4cc6-9f45-af32a4071a5d-kube-api-access-9ssmp\") pod \"watcher-operator-controller-manager-6b9957f54f-f5nmv\" (UID: \"29eab2f9-7ef6-4cc6-9f45-af32a4071a5d\") " 
pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.639936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kzdd\" (UniqueName: \"kubernetes.io/projected/e55f7c89-8011-437e-bcbc-b19ae9e25acd-kube-api-access-8kzdd\") pod \"test-operator-controller-manager-85777745bb-q6nwl\" (UID: \"e55f7c89-8011-437e-bcbc-b19ae9e25acd\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.640100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.640231 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cd4d\" (UniqueName: \"kubernetes.io/projected/0775c44c-131f-4a9c-89d5-bd724765e310-kube-api-access-6cd4d\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.641885 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.642001 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert podName:8c17b6bf-0c13-491a-977a-95566d56d7c4 nodeName:}" failed. 
No retries permitted until 2025-10-01 16:16:19.641988075 +0000 UTC m=+842.641634910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert") pod "infra-operator-controller-manager-9d6c5db85-bwj7p" (UID: "8c17b6bf-0c13-491a-977a-95566d56d7c4") : secret "infra-operator-webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.678103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kzdd\" (UniqueName: \"kubernetes.io/projected/e55f7c89-8011-437e-bcbc-b19ae9e25acd-kube-api-access-8kzdd\") pod \"test-operator-controller-manager-85777745bb-q6nwl\" (UID: \"e55f7c89-8011-437e-bcbc-b19ae9e25acd\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.681697 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ssmp\" (UniqueName: \"kubernetes.io/projected/29eab2f9-7ef6-4cc6-9f45-af32a4071a5d-kube-api-access-9ssmp\") pod \"watcher-operator-controller-manager-6b9957f54f-f5nmv\" (UID: \"29eab2f9-7ef6-4cc6-9f45-af32a4071a5d\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.683315 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.702691 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.742625 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcc47\" (UniqueName: \"kubernetes.io/projected/9bb8b56a-c568-4ea4-985e-a80d49b61197-kube-api-access-mcc47\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp\" (UID: \"9bb8b56a-c568-4ea4-985e-a80d49b61197\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.742746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cd4d\" (UniqueName: \"kubernetes.io/projected/0775c44c-131f-4a9c-89d5-bd724765e310-kube-api-access-6cd4d\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.742809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0775c44c-131f-4a9c-89d5-bd724765e310-cert\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.743289 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.743346 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0775c44c-131f-4a9c-89d5-bd724765e310-cert podName:0775c44c-131f-4a9c-89d5-bd724765e310 nodeName:}" failed. 
No retries permitted until 2025-10-01 16:16:19.243330364 +0000 UTC m=+842.242977199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0775c44c-131f-4a9c-89d5-bd724765e310-cert") pod "openstack-operator-controller-manager-c86467d95-tnl8h" (UID: "0775c44c-131f-4a9c-89d5-bd724765e310") : secret "webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.763440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cd4d\" (UniqueName: \"kubernetes.io/projected/0775c44c-131f-4a9c-89d5-bd724765e310-kube-api-access-6cd4d\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.785759 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.846438 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.858724 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.859333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcc47\" (UniqueName: \"kubernetes.io/projected/9bb8b56a-c568-4ea4-985e-a80d49b61197-kube-api-access-mcc47\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp\" (UID: \"9bb8b56a-c568-4ea4-985e-a80d49b61197\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.892501 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcc47\" (UniqueName: \"kubernetes.io/projected/9bb8b56a-c568-4ea4-985e-a80d49b61197-kube-api-access-mcc47\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp\" (UID: \"9bb8b56a-c568-4ea4-985e-a80d49b61197\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.905804 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.909087 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s"] Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.932071 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" Oct 01 16:16:18 crc kubenswrapper[4764]: I1001 16:16:18.960856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.961240 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 16:16:18 crc kubenswrapper[4764]: E1001 16:16:18.961304 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert podName:ba9b6db9-115e-4760-aef3-107976da810e nodeName:}" failed. No retries permitted until 2025-10-01 16:16:19.961285459 +0000 UTC m=+842.960932294 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" (UID: "ba9b6db9-115e-4760-aef3-107976da810e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.170472 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.170526 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.187558 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt"] Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.271148 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe3c02d_9e92_4628_8d94_6797d56fe480.slice/crio-a7ac58e45bc9cc044f428fadc6b797d4e80d4d65eff66120efa4d3a61fcedb49 WatchSource:0}: Error finding container a7ac58e45bc9cc044f428fadc6b797d4e80d4d65eff66120efa4d3a61fcedb49: Status 404 returned error can't find the container with id a7ac58e45bc9cc044f428fadc6b797d4e80d4d65eff66120efa4d3a61fcedb49 Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.271860 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k6866" podUID="1709385e-fe6c-443e-a437-ceded08bde5b" containerName="registry-server" probeResult="failure" output=< Oct 01 16:16:19 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Oct 01 16:16:19 crc kubenswrapper[4764]: > Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.273838 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0775c44c-131f-4a9c-89d5-bd724765e310-cert\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.274223 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fff9cd4_9690_4a70_a578_0eadbcbb47d6.slice/crio-905435dc1cec4d10e6e19914cd7a1248f5a98daa503c8bbf116fee57b0bc054a WatchSource:0}: Error finding container 905435dc1cec4d10e6e19914cd7a1248f5a98daa503c8bbf116fee57b0bc054a: Status 404 returned error can't find the container with id 905435dc1cec4d10e6e19914cd7a1248f5a98daa503c8bbf116fee57b0bc054a Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.278498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0775c44c-131f-4a9c-89d5-bd724765e310-cert\") pod \"openstack-operator-controller-manager-c86467d95-tnl8h\" (UID: \"0775c44c-131f-4a9c-89d5-bd724765e310\") " pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.346866 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.378225 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.419068 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-lms99"] Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.432325 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b31c01_ec06_434e_af2a_228a1ee7ec19.slice/crio-4c8efd5670fadae29a17da0bd36cbcc54230729293e3a0e494ed7f93ee1529fc WatchSource:0}: Error finding container 4c8efd5670fadae29a17da0bd36cbcc54230729293e3a0e494ed7f93ee1529fc: Status 404 returned error can't find the container with id 4c8efd5670fadae29a17da0bd36cbcc54230729293e3a0e494ed7f93ee1529fc Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.517130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.545177 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.562087 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.585085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" event={"ID":"e4b31c01-ec06-434e-af2a-228a1ee7ec19","Type":"ContainerStarted","Data":"4c8efd5670fadae29a17da0bd36cbcc54230729293e3a0e494ed7f93ee1529fc"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.586613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" event={"ID":"9973c37b-d58f-48b0-8c1e-707576e2cb09","Type":"ContainerStarted","Data":"cb7cc90b475931b7785692c18b3e748c1ecd7d6b0c43c89133934847e98ab855"} Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.588706 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c0305d_d391_4892_869d_f5702a69cc45.slice/crio-04b318a264b0bf13f678a0bc13dc1851a4ce9635a4d1bb46bbe215902436c89a WatchSource:0}: Error finding container 04b318a264b0bf13f678a0bc13dc1851a4ce9635a4d1bb46bbe215902436c89a: Status 404 returned error can't find the container with id 04b318a264b0bf13f678a0bc13dc1851a4ce9635a4d1bb46bbe215902436c89a Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.588795 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" event={"ID":"0fe3c02d-9e92-4628-8d94-6797d56fe480","Type":"ContainerStarted","Data":"a7ac58e45bc9cc044f428fadc6b797d4e80d4d65eff66120efa4d3a61fcedb49"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.597762 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" event={"ID":"74ebea6a-ca87-4ccb-ab25-3c4899c04d39","Type":"ContainerStarted","Data":"855aba2d299f45bad62b6bbdba7e8d209417d5d680a029d0913dd75ff7cbcab4"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.602213 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" event={"ID":"af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd","Type":"ContainerStarted","Data":"ca9e57ef575fe017f6fb46352ca2b189ee509392d1bb44856df022fab9370c37"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.603378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" event={"ID":"db117cfb-3e46-4428-93b1-44a66101c57d","Type":"ContainerStarted","Data":"e5c9deeca9a3a00382ad17d65633f8fbf7400138332a2c4d82b16111680d81bf"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.604461 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" event={"ID":"323e4260-1016-4601-a8c1-f75641230fdb","Type":"ContainerStarted","Data":"244a1a95e3e6f8b81aee0af8400e890d89fb0866160a9bef3db9b5a31bf783ba"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.605919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" event={"ID":"0fff9cd4-9690-4a70-a578-0eadbcbb47d6","Type":"ContainerStarted","Data":"905435dc1cec4d10e6e19914cd7a1248f5a98daa503c8bbf116fee57b0bc054a"} Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.678831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.690866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c17b6bf-0c13-491a-977a-95566d56d7c4-cert\") pod \"infra-operator-controller-manager-9d6c5db85-bwj7p\" (UID: \"8c17b6bf-0c13-491a-977a-95566d56d7c4\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.887492 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.905241 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.914976 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-q6nwl"] 
Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.918341 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.924755 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.931678 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.937751 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.941842 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.960952 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.964983 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb"] Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.971558 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp"] Oct 01 16:16:19 crc kubenswrapper[4764]: E1001 16:16:19.975476 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qlhk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
neutron-operator-controller-manager-849d5b9b84-z9988_openstack-operators(53b1bb68-341f-4635-8339-ff10c9b08dee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 16:16:19 crc kubenswrapper[4764]: E1001 16:16:19.976565 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7b4rx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-rkr2b_openstack-operators(33cb2692-6fcf-4af5-bf43-697a4a740c19): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 16:16:19 crc kubenswrapper[4764]: E1001 16:16:19.977361 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ssmp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b9957f54f-f5nmv_openstack-operators(29eab2f9-7ef6-4cc6-9f45-af32a4071a5d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.980426 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h"] Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.980489 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29eab2f9_7ef6_4cc6_9f45_af32a4071a5d.slice/crio-5ed76b7e3d80daa7932682e7ceb6ec4b0073e3af2addd6103052f27d96f2b442 WatchSource:0}: Error finding container 5ed76b7e3d80daa7932682e7ceb6ec4b0073e3af2addd6103052f27d96f2b442: Status 404 returned error can't find the container with id 5ed76b7e3d80daa7932682e7ceb6ec4b0073e3af2addd6103052f27d96f2b442 Oct 01 16:16:19 crc kubenswrapper[4764]: E1001 16:16:19.983410 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mcc47,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp_openstack-operators(9bb8b56a-c568-4ea4-985e-a80d49b61197): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 16:16:19 crc kubenswrapper[4764]: E1001 16:16:19.983555 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8kzdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-q6nwl_openstack-operators(e55f7c89-8011-437e-bcbc-b19ae9e25acd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 16:16:19 crc kubenswrapper[4764]: E1001 16:16:19.984737 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" podUID="9bb8b56a-c568-4ea4-985e-a80d49b61197" Oct 01 16:16:19 crc 
kubenswrapper[4764]: I1001 16:16:19.985337 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.989577 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705820be_248f_49fb_9ace_0f333674985a.slice/crio-c0182105b7f79f43715f179454841412d4410a2fa79247d3c465e36bda08228f WatchSource:0}: Error finding container c0182105b7f79f43715f179454841412d4410a2fa79247d3c465e36bda08228f: Status 404 returned error can't find the container with id c0182105b7f79f43715f179454841412d4410a2fa79247d3c465e36bda08228f Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.991224 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:19 crc kubenswrapper[4764]: I1001 16:16:19.996451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba9b6db9-115e-4760-aef3-107976da810e-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6\" (UID: \"ba9b6db9-115e-4760-aef3-107976da810e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:19 crc kubenswrapper[4764]: W1001 16:16:19.997306 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0775c44c_131f_4a9c_89d5_bd724765e310.slice/crio-e34a38787e1c33c5a9257d9c6224d3074d925c54fe280b3e6450d7ea6c7285c2 WatchSource:0}: Error finding container 
e34a38787e1c33c5a9257d9c6224d3074d925c54fe280b3e6450d7ea6c7285c2: Status 404 returned error can't find the container with id e34a38787e1c33c5a9257d9c6224d3074d925c54fe280b3e6450d7ea6c7285c2 Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.011735 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-md87n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-84d6b4b759-vbgxb_openstack-operators(705820be-248f-49fb-9ace-0f333674985a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.220266 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.231656 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" podUID="53b1bb68-341f-4635-8339-ff10c9b08dee" Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.231790 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" podUID="29eab2f9-7ef6-4cc6-9f45-af32a4071a5d" Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.257879 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" podUID="33cb2692-6fcf-4af5-bf43-697a4a740c19" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.278881 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p"] Oct 01 16:16:20 crc kubenswrapper[4764]: W1001 16:16:20.304277 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c17b6bf_0c13_491a_977a_95566d56d7c4.slice/crio-207b887053c6d6c89dc7c71d5a5ca4c901509bcf25d89e1f92be64b8453ae6b4 WatchSource:0}: Error finding container 207b887053c6d6c89dc7c71d5a5ca4c901509bcf25d89e1f92be64b8453ae6b4: Status 404 returned error can't find the container with id 207b887053c6d6c89dc7c71d5a5ca4c901509bcf25d89e1f92be64b8453ae6b4 Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.382109 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" podUID="705820be-248f-49fb-9ace-0f333674985a" Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.382690 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" podUID="e55f7c89-8011-437e-bcbc-b19ae9e25acd" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.620502 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" event={"ID":"29eab2f9-7ef6-4cc6-9f45-af32a4071a5d","Type":"ContainerStarted","Data":"7c02a648cdbbd2195c24fe7a371a9aea74c7be2bb2bfc9e5ab0a0ff7f85b34bf"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.620555 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" event={"ID":"29eab2f9-7ef6-4cc6-9f45-af32a4071a5d","Type":"ContainerStarted","Data":"5ed76b7e3d80daa7932682e7ceb6ec4b0073e3af2addd6103052f27d96f2b442"} Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.622675 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" podUID="29eab2f9-7ef6-4cc6-9f45-af32a4071a5d" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.623692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" 
event={"ID":"e55f7c89-8011-437e-bcbc-b19ae9e25acd","Type":"ContainerStarted","Data":"0ecbb8140361d0f9855f1a6c8d6f49db557d1e54706142014a44daca090374fe"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.623736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" event={"ID":"e55f7c89-8011-437e-bcbc-b19ae9e25acd","Type":"ContainerStarted","Data":"1877ac011eaf87393877f4820e94b67dfc11fb5beeb91a98e0361ef2ac1900be"} Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.625079 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" podUID="e55f7c89-8011-437e-bcbc-b19ae9e25acd" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.636393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" event={"ID":"705820be-248f-49fb-9ace-0f333674985a","Type":"ContainerStarted","Data":"cb5e756f82130ac9e1cfbdd7c9cd5ef48e11a67a23b8d92bb66722ba6813e62c"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.636450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" event={"ID":"705820be-248f-49fb-9ace-0f333674985a","Type":"ContainerStarted","Data":"c0182105b7f79f43715f179454841412d4410a2fa79247d3c465e36bda08228f"} Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.641186 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" podUID="705820be-248f-49fb-9ace-0f333674985a" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.643703 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" event={"ID":"10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b","Type":"ContainerStarted","Data":"eca817a2c26cb285c0a121fcca25060e1913f09545f25d4d4d74e81fe056c6d9"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.649497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" event={"ID":"8c17b6bf-0c13-491a-977a-95566d56d7c4","Type":"ContainerStarted","Data":"207b887053c6d6c89dc7c71d5a5ca4c901509bcf25d89e1f92be64b8453ae6b4"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.659976 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" event={"ID":"9bb8b56a-c568-4ea4-985e-a80d49b61197","Type":"ContainerStarted","Data":"8a6b5f7036a7b9e8e279b7fa2fe6cd4358d76caed832aab3741b6a1c5689a121"} Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.663724 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" podUID="9bb8b56a-c568-4ea4-985e-a80d49b61197" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.664232 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" 
event={"ID":"d591342f-60e7-48db-9073-b2d6e9fe6992","Type":"ContainerStarted","Data":"ba57007a5251ae9dcb9bf3c0303aa012c08901708900c8b84fd4cf87850b001f"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.671001 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" event={"ID":"394ffa3b-3cd6-4deb-a436-624fa75155a2","Type":"ContainerStarted","Data":"72af8d4407eff2e971eceacb5e42d78c642bb8624d6e479d7ee2562a234d2f86"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.674591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" event={"ID":"67c0305d-d391-4892-869d-f5702a69cc45","Type":"ContainerStarted","Data":"04b318a264b0bf13f678a0bc13dc1851a4ce9635a4d1bb46bbe215902436c89a"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.676098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" event={"ID":"33cb2692-6fcf-4af5-bf43-697a4a740c19","Type":"ContainerStarted","Data":"b2e8c25ddbe299dfbe7e8a92a224cd081bdc83aec7264e4bae8e9ee6d9b24b62"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.676127 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" event={"ID":"33cb2692-6fcf-4af5-bf43-697a4a740c19","Type":"ContainerStarted","Data":"2c325cf2eee1dd80cc44b42f76f63b5aad16f08b8fc73e5da07a84d495634ad3"} Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.701269 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" 
podUID="33cb2692-6fcf-4af5-bf43-697a4a740c19" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.703380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" event={"ID":"2050b8cd-91c1-4962-b346-bbfa5c4e652e","Type":"ContainerStarted","Data":"38ce052f8471c9c293477e8f2fea5e9b74499604b534f1315566491e6137d435"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.712342 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" event={"ID":"0775c44c-131f-4a9c-89d5-bd724765e310","Type":"ContainerStarted","Data":"a2082e44fe164903b464ee92f129ed82771531c982d2c09d6e45bfd30879a330"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.712391 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" event={"ID":"0775c44c-131f-4a9c-89d5-bd724765e310","Type":"ContainerStarted","Data":"30691c631a9b1e73c12194d92f375d7b59d046029145128b7e1eb5f55bfef73d"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.712404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" event={"ID":"0775c44c-131f-4a9c-89d5-bd724765e310","Type":"ContainerStarted","Data":"e34a38787e1c33c5a9257d9c6224d3074d925c54fe280b3e6450d7ea6c7285c2"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.712943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.726798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" 
event={"ID":"53b1bb68-341f-4635-8339-ff10c9b08dee","Type":"ContainerStarted","Data":"a8d481016907d98e60b561429b5a7c44e712133210affb9d93ceb258a9eb03f4"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.726846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" event={"ID":"53b1bb68-341f-4635-8339-ff10c9b08dee","Type":"ContainerStarted","Data":"838aa4bc2bf93541c124d1992a054e12ecf8bb7d0852755a03aeae902f67c382"} Oct 01 16:16:20 crc kubenswrapper[4764]: E1001 16:16:20.729240 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" podUID="53b1bb68-341f-4635-8339-ff10c9b08dee" Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.733295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" event={"ID":"c535869c-c448-4bea-944d-fce55ddd334c","Type":"ContainerStarted","Data":"0d15a06974dcadf9830c0a884314fb417088b0c085b98dde041ff2823f0843fe"} Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.740098 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6"] Oct 01 16:16:20 crc kubenswrapper[4764]: I1001 16:16:20.761459 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" podStartSLOduration=2.761037056 podStartE2EDuration="2.761037056s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 16:16:20.753415457 +0000 UTC m=+843.753062292" watchObservedRunningTime="2025-10-01 16:16:20.761037056 +0000 UTC m=+843.760683891" Oct 01 16:16:21 crc kubenswrapper[4764]: I1001 16:16:21.747026 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" event={"ID":"ba9b6db9-115e-4760-aef3-107976da810e","Type":"ContainerStarted","Data":"6f585af548d9c21140dfb63f4532af72bef3f9504e10f02c9d6a985f33dbbdda"} Oct 01 16:16:21 crc kubenswrapper[4764]: E1001 16:16:21.753309 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" podUID="e55f7c89-8011-437e-bcbc-b19ae9e25acd" Oct 01 16:16:21 crc kubenswrapper[4764]: E1001 16:16:21.753654 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" podUID="29eab2f9-7ef6-4cc6-9f45-af32a4071a5d" Oct 01 16:16:21 crc kubenswrapper[4764]: E1001 16:16:21.753694 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" podUID="33cb2692-6fcf-4af5-bf43-697a4a740c19" Oct 01 16:16:21 crc kubenswrapper[4764]: E1001 16:16:21.753732 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" podUID="9bb8b56a-c568-4ea4-985e-a80d49b61197" Oct 01 16:16:21 crc kubenswrapper[4764]: E1001 16:16:21.753863 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" podUID="53b1bb68-341f-4635-8339-ff10c9b08dee" Oct 01 16:16:21 crc kubenswrapper[4764]: E1001 16:16:21.753898 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" podUID="705820be-248f-49fb-9ace-0f333674985a" Oct 01 16:16:21 crc kubenswrapper[4764]: I1001 16:16:21.914270 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:16:21 crc kubenswrapper[4764]: I1001 16:16:21.914531 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:16:21 crc kubenswrapper[4764]: I1001 16:16:21.914569 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:16:21 crc kubenswrapper[4764]: I1001 16:16:21.915087 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36994ceb1acaf44344047ef2a5795d007fa57999fff00a6c7967859219769b54"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:16:21 crc kubenswrapper[4764]: I1001 16:16:21.915129 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://36994ceb1acaf44344047ef2a5795d007fa57999fff00a6c7967859219769b54" gracePeriod=600 Oct 01 16:16:22 crc kubenswrapper[4764]: I1001 16:16:22.757444 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="36994ceb1acaf44344047ef2a5795d007fa57999fff00a6c7967859219769b54" exitCode=0 Oct 01 16:16:22 crc kubenswrapper[4764]: I1001 16:16:22.757985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"36994ceb1acaf44344047ef2a5795d007fa57999fff00a6c7967859219769b54"} Oct 01 16:16:22 crc kubenswrapper[4764]: I1001 16:16:22.758013 4764 scope.go:117] "RemoveContainer" containerID="c9d37c73ed33c3edc83cd30171905ddb550eb174fab70b91e5c87cb08088ccc7" Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.211266 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.265984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k6866" Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.337256 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k6866"] Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.445323 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qn928"] Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.445548 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qn928" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="registry-server" containerID="cri-o://7792bab82a75c60a89818ad1692f1c46bde38d2ac4a9f1cd5de1ded2b947ac82" gracePeriod=2 Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.810625 4764 generic.go:334] "Generic (PLEG): container finished" podID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerID="7792bab82a75c60a89818ad1692f1c46bde38d2ac4a9f1cd5de1ded2b947ac82" exitCode=0 Oct 01 16:16:28 crc kubenswrapper[4764]: I1001 16:16:28.811101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn928" event={"ID":"3ced474e-2212-4ae4-b305-fbc1f0e05a93","Type":"ContainerDied","Data":"7792bab82a75c60a89818ad1692f1c46bde38d2ac4a9f1cd5de1ded2b947ac82"} Oct 01 16:16:29 crc kubenswrapper[4764]: I1001 16:16:29.524585 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c86467d95-tnl8h" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.308980 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.474858 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s9zt\" (UniqueName: \"kubernetes.io/projected/3ced474e-2212-4ae4-b305-fbc1f0e05a93-kube-api-access-8s9zt\") pod \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.474991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-utilities\") pod \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.475080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-catalog-content\") pod \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\" (UID: \"3ced474e-2212-4ae4-b305-fbc1f0e05a93\") " Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.477389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-utilities" (OuterVolumeSpecName: "utilities") pod "3ced474e-2212-4ae4-b305-fbc1f0e05a93" (UID: "3ced474e-2212-4ae4-b305-fbc1f0e05a93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.487513 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ced474e-2212-4ae4-b305-fbc1f0e05a93-kube-api-access-8s9zt" (OuterVolumeSpecName: "kube-api-access-8s9zt") pod "3ced474e-2212-4ae4-b305-fbc1f0e05a93" (UID: "3ced474e-2212-4ae4-b305-fbc1f0e05a93"). InnerVolumeSpecName "kube-api-access-8s9zt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.541350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ced474e-2212-4ae4-b305-fbc1f0e05a93" (UID: "3ced474e-2212-4ae4-b305-fbc1f0e05a93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.576622 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.576657 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s9zt\" (UniqueName: \"kubernetes.io/projected/3ced474e-2212-4ae4-b305-fbc1f0e05a93-kube-api-access-8s9zt\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.576669 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ced474e-2212-4ae4-b305-fbc1f0e05a93-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.823421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" event={"ID":"394ffa3b-3cd6-4deb-a436-624fa75155a2","Type":"ContainerStarted","Data":"6ee6f184c9d48a56d64cea857c1bfe5f0ddc684b9aa13375a5906b7cde02d669"} Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.825507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qn928" event={"ID":"3ced474e-2212-4ae4-b305-fbc1f0e05a93","Type":"ContainerDied","Data":"40b998128c6d28131fe4e858f32e7709675b31011faa571ca297c4b89f121fcd"} Oct 01 16:16:30 crc 
kubenswrapper[4764]: I1001 16:16:30.825551 4764 scope.go:117] "RemoveContainer" containerID="7792bab82a75c60a89818ad1692f1c46bde38d2ac4a9f1cd5de1ded2b947ac82" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.825625 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qn928" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.841586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" event={"ID":"0fff9cd4-9690-4a70-a578-0eadbcbb47d6","Type":"ContainerStarted","Data":"38111e640b775544269f41b36f90b078e0e1ddd51b126e72ec01070faf3ade3b"} Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.843668 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" event={"ID":"c535869c-c448-4bea-944d-fce55ddd334c","Type":"ContainerStarted","Data":"a13aca1721ed8b6252274abe150b591c72f1cb699fd5e4e7f436b391c057cc87"} Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.846855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" event={"ID":"74ebea6a-ca87-4ccb-ab25-3c4899c04d39","Type":"ContainerStarted","Data":"595eed8015d106da47df93a5ea1ac1b3e783858eecca7c80f6c578b204702ec4"} Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.849622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"996ccf5d7c8e2755552554ff5a74e5db9102336da04bc8666a5ec3ad70d33d62"} Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.852463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" 
event={"ID":"10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b","Type":"ContainerStarted","Data":"1b862426e7b208b93e023d5fc1c669cbfe4fbabb78b368c502995be4d9a3885f"} Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.857567 4764 scope.go:117] "RemoveContainer" containerID="671ea14b516816b121dfe830d1e16d8c2f1d696957ec68f1045f5694563bd81b" Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.868925 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qn928"] Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.883944 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qn928"] Oct 01 16:16:30 crc kubenswrapper[4764]: I1001 16:16:30.905908 4764 scope.go:117] "RemoveContainer" containerID="e5c7055262404416072f800f0a273246ea44ac972902c5c4aa55f33cbae39cce" Oct 01 16:16:31 crc kubenswrapper[4764]: I1001 16:16:31.741020 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" path="/var/lib/kubelet/pods/3ced474e-2212-4ae4-b305-fbc1f0e05a93/volumes" Oct 01 16:16:33 crc kubenswrapper[4764]: I1001 16:16:33.886486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" event={"ID":"9973c37b-d58f-48b0-8c1e-707576e2cb09","Type":"ContainerStarted","Data":"8958fbf403cbc718c405109ff471fbf67e65be63346774d7e097f1dc5d4738d9"} Oct 01 16:16:33 crc kubenswrapper[4764]: I1001 16:16:33.888926 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" event={"ID":"d591342f-60e7-48db-9073-b2d6e9fe6992","Type":"ContainerStarted","Data":"b3ba5e7b332fb10e0547ae25ec5526cebb5647247c64fe8e3b9c20ac7269ef84"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.895834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" 
event={"ID":"e4b31c01-ec06-434e-af2a-228a1ee7ec19","Type":"ContainerStarted","Data":"61170b9a88da0c5b67e3c9e2f200af4a18f7b3029e0915edcc20ea2c3d668ee3"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.897295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" event={"ID":"67c0305d-d391-4892-869d-f5702a69cc45","Type":"ContainerStarted","Data":"4fe67e7eaa24ae140c59c6b3780c1601d8f3e9ec13f41d08a40950d713c847b9"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.898666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" event={"ID":"af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd","Type":"ContainerStarted","Data":"4a684a152a6602beec168c95feba2ba53564bf320c8b9ecd0b74f0a7c8489726"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.899763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" event={"ID":"2050b8cd-91c1-4962-b346-bbfa5c4e652e","Type":"ContainerStarted","Data":"4ebc08e350dee1de6a6c010ff5346330b7d004e619453324711a14222f9baaef"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.900929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" event={"ID":"323e4260-1016-4601-a8c1-f75641230fdb","Type":"ContainerStarted","Data":"2143ab128f7865c96a1b98e160453aca9e050c3b5287b357dad757af14033871"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.902438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" event={"ID":"8c17b6bf-0c13-491a-977a-95566d56d7c4","Type":"ContainerStarted","Data":"be343b4893e56c3e8906a3d6efc1a23be98d77d23e21e3024ad93a04fe0f51b2"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.903473 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" event={"ID":"ba9b6db9-115e-4760-aef3-107976da810e","Type":"ContainerStarted","Data":"bb126b5910df0a9e7939b42ffe866e49461a54ecca0487b58f869a1f2152b6cf"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.904626 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" event={"ID":"db117cfb-3e46-4428-93b1-44a66101c57d","Type":"ContainerStarted","Data":"bd4b1c81cabc0b895bf9265256e4f21e7a67a9fc1a698fae493b5b745a3b1c71"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.906031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" event={"ID":"0fff9cd4-9690-4a70-a578-0eadbcbb47d6","Type":"ContainerStarted","Data":"e438ac5ca1272038610b4d0b7d8a976316bba83dbdd6c9d9b83ac204576569b7"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.907325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" event={"ID":"c535869c-c448-4bea-944d-fce55ddd334c","Type":"ContainerStarted","Data":"abbd28b2210404cd5b1bbf4ab3538449c3f1f594df0bebb49436e172c9149060"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.908813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" event={"ID":"0fe3c02d-9e92-4628-8d94-6797d56fe480","Type":"ContainerStarted","Data":"dc109d718f3d43a54552cfd5e08e8f786cc1884bda2f66e70162c3559e2db675"} Oct 01 16:16:34 crc kubenswrapper[4764]: I1001 16:16:34.910618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" event={"ID":"74ebea6a-ca87-4ccb-ab25-3c4899c04d39","Type":"ContainerStarted","Data":"5f642a137594a2454d4c5c806cf2f250ab809c67aa8785e9c6616067f0745d0f"} 
Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.932391 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" event={"ID":"ba9b6db9-115e-4760-aef3-107976da810e","Type":"ContainerStarted","Data":"73ba2d9b451dd8858472980ea1279fcf402dfa245c7e8f04a5aadd10832951bb"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.933781 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.938128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" event={"ID":"67c0305d-d391-4892-869d-f5702a69cc45","Type":"ContainerStarted","Data":"16914300fc287a6d3bac12076d986f6ae3c8bf823e14358c5ee5011eeb3e671b"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.938242 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.940004 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" event={"ID":"323e4260-1016-4601-a8c1-f75641230fdb","Type":"ContainerStarted","Data":"360484e62ff15a396587b4cdc1fe9c7024c3ec2361f7b6eb3bb09b8fa829e028"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.940415 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.943497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" 
event={"ID":"db117cfb-3e46-4428-93b1-44a66101c57d","Type":"ContainerStarted","Data":"6e2985ba6e65b34563fa3fdeb788d70b8d3f5bcdd2e5c9f6f67804e568b89955"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.943854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.945731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" event={"ID":"394ffa3b-3cd6-4deb-a436-624fa75155a2","Type":"ContainerStarted","Data":"e16d3e8dabe84926bbfea0c991c2ef3b036bdb95f0fb2c0b0f4a888d3d5c9258"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.946391 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.948299 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" event={"ID":"2050b8cd-91c1-4962-b346-bbfa5c4e652e","Type":"ContainerStarted","Data":"12fae30814c4fd43c07e265a4703d53c4f15e356d33689f0f53f63188663e403"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.948665 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.950120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" event={"ID":"8c17b6bf-0c13-491a-977a-95566d56d7c4","Type":"ContainerStarted","Data":"6b40e5841df5514b249642f11405d2af5b3532d5cf21edba25b05b4135b27a05"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.950500 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.950658 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.951754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" event={"ID":"0fe3c02d-9e92-4628-8d94-6797d56fe480","Type":"ContainerStarted","Data":"2f254564912b9fc4302e50be9ebf50a37eff19538f385a11a706ff1fe4dc9a64"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.952167 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.962907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" event={"ID":"10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b","Type":"ContainerStarted","Data":"9f517b12acf026898e19978b4d0d220b1a9511c8bddcdfbb241982c6f2627220"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.963178 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.965915 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" podStartSLOduration=8.737616414 podStartE2EDuration="17.965895851s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:20.777958515 +0000 UTC m=+843.777605350" lastFinishedPulling="2025-10-01 16:16:30.006237952 +0000 UTC m=+853.005884787" observedRunningTime="2025-10-01 16:16:35.959165704 +0000 UTC m=+858.958812559" 
watchObservedRunningTime="2025-10-01 16:16:35.965895851 +0000 UTC m=+858.965542686" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.968310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.968940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" event={"ID":"d591342f-60e7-48db-9073-b2d6e9fe6992","Type":"ContainerStarted","Data":"d576b564ddd6d534072a0069821e91b49a14fb88186971e888e16c63945c01ea"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.969245 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.973696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" event={"ID":"af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd","Type":"ContainerStarted","Data":"38f8aea483703c58e30e3e2944d395a64fc9e499b911f8783a9d95b91e19f8c2"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.974185 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.976411 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" podStartSLOduration=8.596815799 podStartE2EDuration="18.97639972s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.596550307 +0000 UTC m=+842.596197142" lastFinishedPulling="2025-10-01 16:16:29.976134228 +0000 UTC m=+852.975781063" observedRunningTime="2025-10-01 16:16:35.974468372 +0000 UTC m=+858.974115207" 
watchObservedRunningTime="2025-10-01 16:16:35.97639972 +0000 UTC m=+858.976046555" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.980452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" event={"ID":"9973c37b-d58f-48b0-8c1e-707576e2cb09","Type":"ContainerStarted","Data":"8b6f90158ad95f7ce54dac9d36ec399224e09ef4925fcf068f57c954f8ffcf44"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.981008 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.987500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" event={"ID":"e4b31c01-ec06-434e-af2a-228a1ee7ec19","Type":"ContainerStarted","Data":"fc11ae35b445894039ad157e52ecd74c90925532fefe8c07ad3f87b3d76b4ffe"} Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.987536 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.987547 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.990257 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.990297 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:35 crc kubenswrapper[4764]: I1001 16:16:35.993350 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.007641 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.007776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.032491 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" podStartSLOduration=8.423104785 podStartE2EDuration="19.032471743s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.386325113 +0000 UTC m=+842.385971948" lastFinishedPulling="2025-10-01 16:16:29.995692071 +0000 UTC m=+852.995338906" observedRunningTime="2025-10-01 16:16:36.005298783 +0000 UTC m=+859.004945618" watchObservedRunningTime="2025-10-01 16:16:36.032471743 +0000 UTC m=+859.032118578" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.034755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" podStartSLOduration=8.625182385 podStartE2EDuration="19.034740839s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.56759384 +0000 UTC m=+842.567240665" lastFinishedPulling="2025-10-01 16:16:29.977152284 +0000 UTC m=+852.976799119" observedRunningTime="2025-10-01 16:16:36.033753215 +0000 UTC m=+859.033400050" watchObservedRunningTime="2025-10-01 16:16:36.034740839 +0000 UTC m=+859.034387674" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.053599 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" podStartSLOduration=9.375700255 podStartE2EDuration="19.053583314s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:20.308359489 +0000 UTC m=+843.308006324" lastFinishedPulling="2025-10-01 16:16:29.986242548 +0000 UTC m=+852.985889383" observedRunningTime="2025-10-01 16:16:36.051998555 +0000 UTC m=+859.051645390" watchObservedRunningTime="2025-10-01 16:16:36.053583314 +0000 UTC m=+859.053230149" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.070804 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" podStartSLOduration=9.004265558 podStartE2EDuration="19.070789248s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.918110018 +0000 UTC m=+842.917756853" lastFinishedPulling="2025-10-01 16:16:29.984633718 +0000 UTC m=+852.984280543" observedRunningTime="2025-10-01 16:16:36.068963994 +0000 UTC m=+859.068610829" watchObservedRunningTime="2025-10-01 16:16:36.070789248 +0000 UTC m=+859.070436083" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.088813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-59r2r" podStartSLOduration=9.071004448 podStartE2EDuration="19.088797643s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.959112403 +0000 UTC m=+842.958759238" lastFinishedPulling="2025-10-01 16:16:29.976905598 +0000 UTC m=+852.976552433" observedRunningTime="2025-10-01 16:16:36.08745516 +0000 UTC m=+859.087101995" watchObservedRunningTime="2025-10-01 16:16:36.088797643 +0000 UTC m=+859.088444478" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.105559 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" podStartSLOduration=8.396295844 podStartE2EDuration="19.105543186s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.275107859 +0000 UTC m=+842.274754694" lastFinishedPulling="2025-10-01 16:16:29.984355201 +0000 UTC m=+852.984002036" observedRunningTime="2025-10-01 16:16:36.104655344 +0000 UTC m=+859.104302179" watchObservedRunningTime="2025-10-01 16:16:36.105543186 +0000 UTC m=+859.105190021" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.137758 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" podStartSLOduration=8.194116675 podStartE2EDuration="19.13774509s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.033086498 +0000 UTC m=+842.032733333" lastFinishedPulling="2025-10-01 16:16:29.976714913 +0000 UTC m=+852.976361748" observedRunningTime="2025-10-01 16:16:36.137209997 +0000 UTC m=+859.136856832" watchObservedRunningTime="2025-10-01 16:16:36.13774509 +0000 UTC m=+859.137391925" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.177502 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" podStartSLOduration=8.705663939 podStartE2EDuration="19.177481s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.434842544 +0000 UTC m=+842.434489379" lastFinishedPulling="2025-10-01 16:16:29.906659605 +0000 UTC m=+852.906306440" observedRunningTime="2025-10-01 16:16:36.176023244 +0000 UTC m=+859.175670079" watchObservedRunningTime="2025-10-01 16:16:36.177481 +0000 UTC m=+859.177127835" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.179104 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-8wqsk" podStartSLOduration=8.170129985 podStartE2EDuration="18.17909151s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.966916197 +0000 UTC m=+842.966563032" lastFinishedPulling="2025-10-01 16:16:29.975877722 +0000 UTC m=+852.975524557" observedRunningTime="2025-10-01 16:16:36.155001896 +0000 UTC m=+859.154648741" watchObservedRunningTime="2025-10-01 16:16:36.17909151 +0000 UTC m=+859.178738355" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.199301 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-lrzn4" podStartSLOduration=8.614145059 podStartE2EDuration="19.199284748s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.412188392 +0000 UTC m=+842.411835227" lastFinishedPulling="2025-10-01 16:16:29.997328081 +0000 UTC m=+852.996974916" observedRunningTime="2025-10-01 16:16:36.195266269 +0000 UTC m=+859.194913104" watchObservedRunningTime="2025-10-01 16:16:36.199284748 +0000 UTC m=+859.198931583" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.226464 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-62xn4" podStartSLOduration=9.228533639 podStartE2EDuration="19.226445908s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.908624203 +0000 UTC m=+842.908271038" lastFinishedPulling="2025-10-01 16:16:29.906536462 +0000 UTC m=+852.906183307" observedRunningTime="2025-10-01 16:16:36.225460194 +0000 UTC m=+859.225107029" watchObservedRunningTime="2025-10-01 16:16:36.226445908 +0000 UTC m=+859.226092743" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.247087 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" podStartSLOduration=8.529664452 podStartE2EDuration="19.247071477s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.259943474 +0000 UTC m=+842.259590309" lastFinishedPulling="2025-10-01 16:16:29.977350499 +0000 UTC m=+852.976997334" observedRunningTime="2025-10-01 16:16:36.24478942 +0000 UTC m=+859.244436255" watchObservedRunningTime="2025-10-01 16:16:36.247071477 +0000 UTC m=+859.246718312" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.263097 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-vnmvt" podStartSLOduration=8.546351695 podStartE2EDuration="19.263082932s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.279945399 +0000 UTC m=+842.279592234" lastFinishedPulling="2025-10-01 16:16:29.996676635 +0000 UTC m=+852.996323471" observedRunningTime="2025-10-01 16:16:36.260730704 +0000 UTC m=+859.260377539" watchObservedRunningTime="2025-10-01 16:16:36.263082932 +0000 UTC m=+859.262729767" Oct 01 16:16:36 crc kubenswrapper[4764]: I1001 16:16:36.285112 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" podStartSLOduration=8.209674435 podStartE2EDuration="18.285097505s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.921235065 +0000 UTC m=+842.920881900" lastFinishedPulling="2025-10-01 16:16:29.996658135 +0000 UTC m=+852.996304970" observedRunningTime="2025-10-01 16:16:36.284354817 +0000 UTC m=+859.284001652" watchObservedRunningTime="2025-10-01 16:16:36.285097505 +0000 UTC m=+859.284744340" Oct 01 16:16:38 crc kubenswrapper[4764]: I1001 16:16:38.019697 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-9vv9s" Oct 01 16:16:38 crc kubenswrapper[4764]: I1001 16:16:38.025084 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6" Oct 01 16:16:38 crc kubenswrapper[4764]: I1001 16:16:38.789208 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-jlwbd" Oct 01 16:16:39 crc kubenswrapper[4764]: I1001 16:16:39.999181 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-bwj7p" Oct 01 16:16:40 crc kubenswrapper[4764]: I1001 16:16:40.061117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" event={"ID":"29eab2f9-7ef6-4cc6-9f45-af32a4071a5d","Type":"ContainerStarted","Data":"d4bbdec2b8ed52521691514a65b48d1f8117ddb3da562252db8704313328dc64"} Oct 01 16:16:40 crc kubenswrapper[4764]: I1001 16:16:40.061475 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:16:40 crc kubenswrapper[4764]: I1001 16:16:40.063459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" event={"ID":"33cb2692-6fcf-4af5-bf43-697a4a740c19","Type":"ContainerStarted","Data":"7a767aeeca25e1dd10c70bd7dbf86265a5ef346c66c078bc419db6024e382c48"} Oct 01 16:16:40 crc kubenswrapper[4764]: I1001 16:16:40.063729 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" Oct 01 16:16:40 crc kubenswrapper[4764]: I1001 16:16:40.111707 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" podStartSLOduration=2.442418447 podStartE2EDuration="22.111667383s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.976459252 +0000 UTC m=+842.976106087" lastFinishedPulling="2025-10-01 16:16:39.645708188 +0000 UTC m=+862.645355023" observedRunningTime="2025-10-01 16:16:40.108546576 +0000 UTC m=+863.108193421" watchObservedRunningTime="2025-10-01 16:16:40.111667383 +0000 UTC m=+863.111314228" Oct 01 16:16:40 crc kubenswrapper[4764]: I1001 16:16:40.112997 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" podStartSLOduration=2.396025383 podStartE2EDuration="22.112989476s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.977282693 +0000 UTC m=+842.976929528" lastFinishedPulling="2025-10-01 16:16:39.694246786 +0000 UTC m=+862.693893621" observedRunningTime="2025-10-01 16:16:40.09329902 +0000 UTC m=+863.092945885" watchObservedRunningTime="2025-10-01 16:16:40.112989476 +0000 UTC m=+863.112636321" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.085491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" event={"ID":"e55f7c89-8011-437e-bcbc-b19ae9e25acd","Type":"ContainerStarted","Data":"9dd7ef409c58e86dff2ee2aad0f5b6ad3ba33f77d762fc96d0050818970ddd71"} Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.086581 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.090281 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" 
event={"ID":"705820be-248f-49fb-9ace-0f333674985a","Type":"ContainerStarted","Data":"8b97cd3d44ab03c251feddfed753153d1213b2de2b3a635df03848d69331cbb7"} Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.090889 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.094483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" event={"ID":"53b1bb68-341f-4635-8339-ff10c9b08dee","Type":"ContainerStarted","Data":"ab97cdcc4988c2365d8fd809119e504a3979fce6af089a7baaf0338bb4ad7218"} Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.094672 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.097455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" event={"ID":"9bb8b56a-c568-4ea4-985e-a80d49b61197","Type":"ContainerStarted","Data":"121ae2149d97cc9192b9ed13ddda50cfa3da4d9a101ee236612592d253262370"} Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.113166 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" podStartSLOduration=3.4505486960000002 podStartE2EDuration="23.113148978s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.983480056 +0000 UTC m=+842.983126891" lastFinishedPulling="2025-10-01 16:16:39.646080338 +0000 UTC m=+862.645727173" observedRunningTime="2025-10-01 16:16:41.107016537 +0000 UTC m=+864.106663402" watchObservedRunningTime="2025-10-01 16:16:41.113148978 +0000 UTC m=+864.112795823" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.137358 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" podStartSLOduration=3.509082974 podStartE2EDuration="23.137336436s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:20.011584052 +0000 UTC m=+843.011230887" lastFinishedPulling="2025-10-01 16:16:39.639837514 +0000 UTC m=+862.639484349" observedRunningTime="2025-10-01 16:16:41.136109255 +0000 UTC m=+864.135756120" watchObservedRunningTime="2025-10-01 16:16:41.137336436 +0000 UTC m=+864.136983281" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.159317 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" podStartSLOduration=4.488753579 podStartE2EDuration="24.159291577s" podCreationTimestamp="2025-10-01 16:16:17 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.975353836 +0000 UTC m=+842.975000671" lastFinishedPulling="2025-10-01 16:16:39.645891784 +0000 UTC m=+862.645538669" observedRunningTime="2025-10-01 16:16:41.152420558 +0000 UTC m=+864.152067433" watchObservedRunningTime="2025-10-01 16:16:41.159291577 +0000 UTC m=+864.158938452" Oct 01 16:16:41 crc kubenswrapper[4764]: I1001 16:16:41.175372 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp" podStartSLOduration=3.502809016 podStartE2EDuration="23.175345933s" podCreationTimestamp="2025-10-01 16:16:18 +0000 UTC" firstStartedPulling="2025-10-01 16:16:19.983324702 +0000 UTC m=+842.982971537" lastFinishedPulling="2025-10-01 16:16:39.655861619 +0000 UTC m=+862.655508454" observedRunningTime="2025-10-01 16:16:41.170605857 +0000 UTC m=+864.170252722" watchObservedRunningTime="2025-10-01 16:16:41.175345933 +0000 UTC m=+864.174992808" Oct 01 16:16:47 crc kubenswrapper[4764]: I1001 16:16:47.981019 4764 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-4724b" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.015835 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-27n67" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.156897 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-g528r" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.377424 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-lms99" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.411748 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-z9988" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.542212 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-fg9hc" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.543726 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b67755477-8xdpz" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.573616 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-vnlvl" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.706181 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-rkr2b" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.850461 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-85777745bb-q6nwl" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.861627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-vbgxb" Oct 01 16:16:48 crc kubenswrapper[4764]: I1001 16:16:48.908250 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-f5nmv" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.519286 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tf6fb"] Oct 01 16:17:04 crc kubenswrapper[4764]: E1001 16:17:04.520158 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="extract-utilities" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.520175 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="extract-utilities" Oct 01 16:17:04 crc kubenswrapper[4764]: E1001 16:17:04.520214 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="extract-content" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.520223 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="extract-content" Oct 01 16:17:04 crc kubenswrapper[4764]: E1001 16:17:04.520252 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="registry-server" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.520261 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="registry-server" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.520454 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3ced474e-2212-4ae4-b305-fbc1f0e05a93" containerName="registry-server" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.521367 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.526172 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-x2knq" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.526469 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.526625 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.526809 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.535196 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tf6fb"] Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.583216 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gjb2d"] Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.584318 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.596218 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.605631 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxk28\" (UniqueName: \"kubernetes.io/projected/04a6e396-f88e-4a43-a3c2-b6f951b87a77-kube-api-access-qxk28\") pod \"dnsmasq-dns-675f4bcbfc-tf6fb\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.605743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6e396-f88e-4a43-a3c2-b6f951b87a77-config\") pod \"dnsmasq-dns-675f4bcbfc-tf6fb\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.618244 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gjb2d"] Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.706731 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-config\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.706780 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 
crc kubenswrapper[4764]: I1001 16:17:04.706860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6e396-f88e-4a43-a3c2-b6f951b87a77-config\") pod \"dnsmasq-dns-675f4bcbfc-tf6fb\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.706890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj5zm\" (UniqueName: \"kubernetes.io/projected/e788c14c-6dea-49ca-a317-0ea8fd947371-kube-api-access-qj5zm\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.706968 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxk28\" (UniqueName: \"kubernetes.io/projected/04a6e396-f88e-4a43-a3c2-b6f951b87a77-kube-api-access-qxk28\") pod \"dnsmasq-dns-675f4bcbfc-tf6fb\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.707720 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6e396-f88e-4a43-a3c2-b6f951b87a77-config\") pod \"dnsmasq-dns-675f4bcbfc-tf6fb\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.733979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxk28\" (UniqueName: \"kubernetes.io/projected/04a6e396-f88e-4a43-a3c2-b6f951b87a77-kube-api-access-qxk28\") pod \"dnsmasq-dns-675f4bcbfc-tf6fb\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 
16:17:04.808118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-config\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.808156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.808189 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj5zm\" (UniqueName: \"kubernetes.io/projected/e788c14c-6dea-49ca-a317-0ea8fd947371-kube-api-access-qj5zm\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.809197 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-config\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.809199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.824024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj5zm\" 
(UniqueName: \"kubernetes.io/projected/e788c14c-6dea-49ca-a317-0ea8fd947371-kube-api-access-qj5zm\") pod \"dnsmasq-dns-78dd6ddcc-gjb2d\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.844732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:04 crc kubenswrapper[4764]: I1001 16:17:04.910008 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:05 crc kubenswrapper[4764]: I1001 16:17:05.311091 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tf6fb"] Oct 01 16:17:05 crc kubenswrapper[4764]: W1001 16:17:05.320332 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04a6e396_f88e_4a43_a3c2_b6f951b87a77.slice/crio-a03ed4fd73aa9fe1b1a1e74e3d4984c8b9e25a49368ed244e8fd22aad74f1aeb WatchSource:0}: Error finding container a03ed4fd73aa9fe1b1a1e74e3d4984c8b9e25a49368ed244e8fd22aad74f1aeb: Status 404 returned error can't find the container with id a03ed4fd73aa9fe1b1a1e74e3d4984c8b9e25a49368ed244e8fd22aad74f1aeb Oct 01 16:17:05 crc kubenswrapper[4764]: I1001 16:17:05.379297 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gjb2d"] Oct 01 16:17:05 crc kubenswrapper[4764]: W1001 16:17:05.380386 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode788c14c_6dea_49ca_a317_0ea8fd947371.slice/crio-602bea37657cce1bdcc4c942714a6047e6e05405dbe6271f8260cee0aae9848a WatchSource:0}: Error finding container 602bea37657cce1bdcc4c942714a6047e6e05405dbe6271f8260cee0aae9848a: Status 404 returned error can't find the container with id 
602bea37657cce1bdcc4c942714a6047e6e05405dbe6271f8260cee0aae9848a Oct 01 16:17:06 crc kubenswrapper[4764]: I1001 16:17:06.323860 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" event={"ID":"04a6e396-f88e-4a43-a3c2-b6f951b87a77","Type":"ContainerStarted","Data":"a03ed4fd73aa9fe1b1a1e74e3d4984c8b9e25a49368ed244e8fd22aad74f1aeb"} Oct 01 16:17:06 crc kubenswrapper[4764]: I1001 16:17:06.325567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" event={"ID":"e788c14c-6dea-49ca-a317-0ea8fd947371","Type":"ContainerStarted","Data":"602bea37657cce1bdcc4c942714a6047e6e05405dbe6271f8260cee0aae9848a"} Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.519367 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tf6fb"] Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.533768 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nsgv9"] Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.535152 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.543336 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nsgv9"] Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.648106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-config\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.648167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.648222 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpt2d\" (UniqueName: \"kubernetes.io/projected/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-kube-api-access-bpt2d\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.749517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-config\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.749642 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.749704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpt2d\" (UniqueName: \"kubernetes.io/projected/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-kube-api-access-bpt2d\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.753900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-config\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.754532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.786417 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpt2d\" (UniqueName: \"kubernetes.io/projected/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-kube-api-access-bpt2d\") pod \"dnsmasq-dns-5ccc8479f9-nsgv9\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.855373 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gjb2d"] Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.857877 4764 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.879473 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4b9c6"] Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.880764 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.888512 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4b9c6"] Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.954787 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-config\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.954822 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:07 crc kubenswrapper[4764]: I1001 16:17:07.954851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpxk\" (UniqueName: \"kubernetes.io/projected/846f753c-873c-4b5d-a31f-1bbc13cafb80-kube-api-access-jnpxk\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.056905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-config\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.057152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.057198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpxk\" (UniqueName: \"kubernetes.io/projected/846f753c-873c-4b5d-a31f-1bbc13cafb80-kube-api-access-jnpxk\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.059470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-config\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.059900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.092412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpxk\" (UniqueName: \"kubernetes.io/projected/846f753c-873c-4b5d-a31f-1bbc13cafb80-kube-api-access-jnpxk\") pod 
\"dnsmasq-dns-57d769cc4f-4b9c6\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.272092 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.347254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nsgv9"] Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.690709 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.691848 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.693724 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.694250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.695395 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.695704 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.695869 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.696916 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.696916 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-server-dockercfg-xv7cg" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.701843 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4b9c6"] Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.707464 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767692 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767787 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq9ph\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-kube-api-access-fq9ph\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc 
kubenswrapper[4764]: I1001 16:17:08.767836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/479f4015-9972-4350-bac3-6292b0c962ec-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767874 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.767922 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 
16:17:08.768248 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/479f4015-9972-4350-bac3-6292b0c962ec-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.768328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/479f4015-9972-4350-bac3-6292b0c962ec-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/479f4015-9972-4350-bac3-6292b0c962ec-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq9ph\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-kube-api-access-fq9ph\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.870760 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.871782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.872111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.872255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.872688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.872666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.877953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/479f4015-9972-4350-bac3-6292b0c962ec-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.879387 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: 
I1001 16:17:08.880135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.883840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/479f4015-9972-4350-bac3-6292b0c962ec-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.890027 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq9ph\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-kube-api-access-fq9ph\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.890455 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.958746 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.960076 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.963148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5v9t9" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.963165 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.963265 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.963292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.963490 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.964253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.965385 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 16:17:08 crc kubenswrapper[4764]: I1001 16:17:08.969392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.022840 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090692 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090891 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkfzz\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-kube-api-access-gkfzz\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 
16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bf385ea-f77a-4773-9c0c-e57f611707db-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.090988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.091012 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bf385ea-f77a-4773-9c0c-e57f611707db-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.091040 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-config-data\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.091086 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.191881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.191944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.191973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.191998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkfzz\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-kube-api-access-gkfzz\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8bf385ea-f77a-4773-9c0c-e57f611707db-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bf385ea-f77a-4773-9c0c-e57f611707db-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192199 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-config-data\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " 
pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.192999 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.194260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.194568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.196070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-server-conf\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.196352 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-config-data\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.196555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.200611 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.201467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bf385ea-f77a-4773-9c0c-e57f611707db-pod-info\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.204477 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bf385ea-f77a-4773-9c0c-e57f611707db-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.204911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.213352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkfzz\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-kube-api-access-gkfzz\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.213433 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.308885 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 16:17:09 crc kubenswrapper[4764]: I1001 16:17:09.362882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" event={"ID":"16c6ad23-1b5b-4a4e-bd34-a34240c67a33","Type":"ContainerStarted","Data":"0b1b5eb86e33a2ade12ae0fa47180e9c0a5afa95432774ea4fc43759047a16c9"} Oct 01 16:17:10 crc kubenswrapper[4764]: I1001 16:17:10.697305 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.386722 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" event={"ID":"846f753c-873c-4b5d-a31f-1bbc13cafb80","Type":"ContainerStarted","Data":"6307b728d39f4462f52a969587e9a001f12695b5e4bbf482b9e7b5d9fe817d86"} Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.772343 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.774133 4764 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.781488 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.781681 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zdncj" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.781956 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.782335 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.782552 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.789095 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.795017 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " 
pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-config-data-default\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mk7\" (UniqueName: \"kubernetes.io/projected/4acbf2ba-c326-445b-b6f6-11458a1dfb68-kube-api-access-79mk7\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843649 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-kolla-config\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc 
kubenswrapper[4764]: I1001 16:17:11.843709 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-secrets\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.843723 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4acbf2ba-c326-445b-b6f6-11458a1dfb68-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.908342 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.909499 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.911395 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.911722 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bqp42" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.911879 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.912144 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.921942 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-secrets\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4acbf2ba-c326-445b-b6f6-11458a1dfb68-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-config-data-default\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945424 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79mk7\" (UniqueName: \"kubernetes.io/projected/4acbf2ba-c326-445b-b6f6-11458a1dfb68-kube-api-access-79mk7\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.945441 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-kolla-config\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.946277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-kolla-config\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.948719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4acbf2ba-c326-445b-b6f6-11458a1dfb68-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.949102 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.950110 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-config-data-default\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.950122 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4acbf2ba-c326-445b-b6f6-11458a1dfb68-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.954635 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-secrets\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.962013 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.982453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4acbf2ba-c326-445b-b6f6-11458a1dfb68-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.987237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:11 crc kubenswrapper[4764]: I1001 16:17:11.988035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mk7\" (UniqueName: \"kubernetes.io/projected/4acbf2ba-c326-445b-b6f6-11458a1dfb68-kube-api-access-79mk7\") pod \"openstack-galera-0\" (UID: \"4acbf2ba-c326-445b-b6f6-11458a1dfb68\") " pod="openstack/openstack-galera-0" Oct 01 16:17:12 crc 
kubenswrapper[4764]: I1001 16:17:12.048702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048793 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048812 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pkg4\" (UniqueName: \"kubernetes.io/projected/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-kube-api-access-2pkg4\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " 
pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.048924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.105846 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149659 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149721 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149764 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc 
kubenswrapper[4764]: I1001 16:17:12.149841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pkg4\" (UniqueName: \"kubernetes.io/projected/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-kube-api-access-2pkg4\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149907 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.149997 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.150546 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.150852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.151534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.151577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.154003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.154109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.155141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.172851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pkg4\" (UniqueName: \"kubernetes.io/projected/4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca-kube-api-access-2pkg4\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.203553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca\") " pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.228324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.279102 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.279984 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.281646 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.283779 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-292rf" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.288084 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.295422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.353943 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c346177b-4aeb-43b2-8f86-ce57d0d42c10-kolla-config\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.354007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c346177b-4aeb-43b2-8f86-ce57d0d42c10-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.354036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c346177b-4aeb-43b2-8f86-ce57d0d42c10-config-data\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.354114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-szqlc\" (UniqueName: \"kubernetes.io/projected/c346177b-4aeb-43b2-8f86-ce57d0d42c10-kube-api-access-szqlc\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.354151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c346177b-4aeb-43b2-8f86-ce57d0d42c10-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.455235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szqlc\" (UniqueName: \"kubernetes.io/projected/c346177b-4aeb-43b2-8f86-ce57d0d42c10-kube-api-access-szqlc\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.455290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c346177b-4aeb-43b2-8f86-ce57d0d42c10-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.455341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c346177b-4aeb-43b2-8f86-ce57d0d42c10-kolla-config\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.455380 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c346177b-4aeb-43b2-8f86-ce57d0d42c10-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.455400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c346177b-4aeb-43b2-8f86-ce57d0d42c10-config-data\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.456191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c346177b-4aeb-43b2-8f86-ce57d0d42c10-config-data\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.458586 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c346177b-4aeb-43b2-8f86-ce57d0d42c10-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.460764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c346177b-4aeb-43b2-8f86-ce57d0d42c10-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.472253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c346177b-4aeb-43b2-8f86-ce57d0d42c10-kolla-config\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.474073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szqlc\" (UniqueName: 
\"kubernetes.io/projected/c346177b-4aeb-43b2-8f86-ce57d0d42c10-kube-api-access-szqlc\") pod \"memcached-0\" (UID: \"c346177b-4aeb-43b2-8f86-ce57d0d42c10\") " pod="openstack/memcached-0" Oct 01 16:17:12 crc kubenswrapper[4764]: I1001 16:17:12.598623 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.379665 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.381176 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.385000 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-s298q" Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.391237 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.495528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqg7p\" (UniqueName: \"kubernetes.io/projected/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0-kube-api-access-qqg7p\") pod \"kube-state-metrics-0\" (UID: \"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0\") " pod="openstack/kube-state-metrics-0" Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.596671 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqg7p\" (UniqueName: \"kubernetes.io/projected/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0-kube-api-access-qqg7p\") pod \"kube-state-metrics-0\" (UID: \"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0\") " pod="openstack/kube-state-metrics-0" Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.613956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqg7p\" (UniqueName: 
\"kubernetes.io/projected/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0-kube-api-access-qqg7p\") pod \"kube-state-metrics-0\" (UID: \"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0\") " pod="openstack/kube-state-metrics-0" Oct 01 16:17:14 crc kubenswrapper[4764]: I1001 16:17:14.702973 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.156015 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.159311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.161332 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.161932 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fhwvc" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.165202 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.165278 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.165214 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.176690 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252796 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2bft\" (UniqueName: \"kubernetes.io/projected/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-kube-api-access-n2bft\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252822 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " 
pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-config\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.252985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354532 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2bft\" (UniqueName: \"kubernetes.io/projected/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-kube-api-access-n2bft\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354844 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.355004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.354853 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.355550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-config\") pod \"ovsdbserver-nb-0\" 
(UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.355623 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.356357 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.356849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-config\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.359361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.367737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.367921 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.377186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2bft\" (UniqueName: \"kubernetes.io/projected/e870edc1-ed3c-4c16-8c20-cde661ac4ce0-kube-api-access-n2bft\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.383747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e870edc1-ed3c-4c16-8c20-cde661ac4ce0\") " pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.408762 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wjjhq"] Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.409864 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.413163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.413275 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-sg7vk" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.413420 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.428245 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cfp4k"] Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.429887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.435901 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wjjhq"] Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.449043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cfp4k"] Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457056 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-log-ovn\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79962852-f159-44df-bd50-38928f3df91d-combined-ca-bundle\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " 
pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/79962852-f159-44df-bd50-38928f3df91d-ovn-controller-tls-certs\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457163 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79962852-f159-44df-bd50-38928f3df91d-scripts\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-run-ovn\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457216 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-run\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.457237 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq5tw\" (UniqueName: \"kubernetes.io/projected/79962852-f159-44df-bd50-38928f3df91d-kube-api-access-dq5tw\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 
01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.491551 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.558712 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-lib\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.558763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-run\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.558795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-log-ovn\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.558896 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79962852-f159-44df-bd50-38928f3df91d-combined-ca-bundle\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.558998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/79962852-f159-44df-bd50-38928f3df91d-ovn-controller-tls-certs\") pod \"ovn-controller-wjjhq\" (UID: 
\"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-log-ovn\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj6nf\" (UniqueName: \"kubernetes.io/projected/da235e7c-70c7-4e8e-bf34-260bfc0cb986-kube-api-access-rj6nf\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79962852-f159-44df-bd50-38928f3df91d-scripts\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-run-ovn\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559719 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da235e7c-70c7-4e8e-bf34-260bfc0cb986-scripts\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc 
kubenswrapper[4764]: I1001 16:17:18.559757 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-etc-ovs\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-run\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq5tw\" (UniqueName: \"kubernetes.io/projected/79962852-f159-44df-bd50-38928f3df91d-kube-api-access-dq5tw\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.559912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-log\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.561179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-run-ovn\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.561285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/79962852-f159-44df-bd50-38928f3df91d-var-run\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.568001 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79962852-f159-44df-bd50-38928f3df91d-combined-ca-bundle\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.568374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/79962852-f159-44df-bd50-38928f3df91d-ovn-controller-tls-certs\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.570574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79962852-f159-44df-bd50-38928f3df91d-scripts\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.587655 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq5tw\" (UniqueName: \"kubernetes.io/projected/79962852-f159-44df-bd50-38928f3df91d-kube-api-access-dq5tw\") pod \"ovn-controller-wjjhq\" (UID: \"79962852-f159-44df-bd50-38928f3df91d\") " pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.661518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-lib\") pod \"ovn-controller-ovs-cfp4k\" (UID: 
\"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.661562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-run\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.661615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj6nf\" (UniqueName: \"kubernetes.io/projected/da235e7c-70c7-4e8e-bf34-260bfc0cb986-kube-api-access-rj6nf\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.662493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da235e7c-70c7-4e8e-bf34-260bfc0cb986-scripts\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.662528 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-etc-ovs\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.662567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-log\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 
16:17:18.662766 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-log\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.662888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-lib\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.662931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-var-run\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.666266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da235e7c-70c7-4e8e-bf34-260bfc0cb986-scripts\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.666340 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/da235e7c-70c7-4e8e-bf34-260bfc0cb986-etc-ovs\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.679000 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj6nf\" (UniqueName: 
\"kubernetes.io/projected/da235e7c-70c7-4e8e-bf34-260bfc0cb986-kube-api-access-rj6nf\") pod \"ovn-controller-ovs-cfp4k\" (UID: \"da235e7c-70c7-4e8e-bf34-260bfc0cb986\") " pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.757719 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:18 crc kubenswrapper[4764]: I1001 16:17:18.767696 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:20 crc kubenswrapper[4764]: E1001 16:17:20.926853 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 16:17:20 crc kubenswrapper[4764]: E1001 16:17:20.927563 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qj5zm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-gjb2d_openstack(e788c14c-6dea-49ca-a317-0ea8fd947371): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:17:20 crc kubenswrapper[4764]: E1001 16:17:20.929306 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" podUID="e788c14c-6dea-49ca-a317-0ea8fd947371" Oct 01 16:17:21 crc kubenswrapper[4764]: E1001 16:17:21.037362 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 16:17:21 crc kubenswrapper[4764]: E1001 16:17:21.037594 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxk28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-tf6fb_openstack(04a6e396-f88e-4a43-a3c2-b6f951b87a77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:17:21 crc kubenswrapper[4764]: E1001 16:17:21.038868 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" podUID="04a6e396-f88e-4a43-a3c2-b6f951b87a77" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.463592 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.465998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.468301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-z6ntx" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.468588 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.468755 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.468900 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.476244 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.491893 4764 generic.go:334] "Generic (PLEG): container finished" podID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerID="2adbe3afd8b5b27f6eb158018480410c8dd0d579f3b1852aa09d96d506304f0c" exitCode=0 Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.491950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" event={"ID":"846f753c-873c-4b5d-a31f-1bbc13cafb80","Type":"ContainerDied","Data":"2adbe3afd8b5b27f6eb158018480410c8dd0d579f3b1852aa09d96d506304f0c"} Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.494235 4764 generic.go:334] "Generic (PLEG): container finished" podID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" 
containerID="9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318" exitCode=0 Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.494337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" event={"ID":"16c6ad23-1b5b-4a4e-bd34-a34240c67a33","Type":"ContainerDied","Data":"9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318"} Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923cf4a9-e116-4a84-ae06-dec150a649bc-config\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/923cf4a9-e116-4a84-ae06-dec150a649bc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/923cf4a9-e116-4a84-ae06-dec150a649bc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514827 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8hlg\" (UniqueName: \"kubernetes.io/projected/923cf4a9-e116-4a84-ae06-dec150a649bc-kube-api-access-f8hlg\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.514902 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.546603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923cf4a9-e116-4a84-ae06-dec150a649bc-config\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/923cf4a9-e116-4a84-ae06-dec150a649bc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/923cf4a9-e116-4a84-ae06-dec150a649bc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626573 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.626644 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8hlg\" (UniqueName: \"kubernetes.io/projected/923cf4a9-e116-4a84-ae06-dec150a649bc-kube-api-access-f8hlg\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.629296 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.635008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/923cf4a9-e116-4a84-ae06-dec150a649bc-config\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.635523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/923cf4a9-e116-4a84-ae06-dec150a649bc-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.635873 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/923cf4a9-e116-4a84-ae06-dec150a649bc-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.639100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.640279 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.641958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.645875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/923cf4a9-e116-4a84-ae06-dec150a649bc-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.649129 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8hlg\" (UniqueName: \"kubernetes.io/projected/923cf4a9-e116-4a84-ae06-dec150a649bc-kube-api-access-f8hlg\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.656914 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.658530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"923cf4a9-e116-4a84-ae06-dec150a649bc\") " pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.676117 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: E1001 16:17:21.736590 4764 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 01 16:17:21 crc kubenswrapper[4764]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/16c6ad23-1b5b-4a4e-bd34-a34240c67a33/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 16:17:21 crc kubenswrapper[4764]: > podSandboxID="0b1b5eb86e33a2ade12ae0fa47180e9c0a5afa95432774ea4fc43759047a16c9" Oct 01 16:17:21 crc kubenswrapper[4764]: E1001 16:17:21.736750 4764 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 16:17:21 crc kubenswrapper[4764]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bpt2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-nsgv9_openstack(16c6ad23-1b5b-4a4e-bd34-a34240c67a33): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/16c6ad23-1b5b-4a4e-bd34-a34240c67a33/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 16:17:21 crc kubenswrapper[4764]: > logger="UnhandledError" Oct 01 16:17:21 crc kubenswrapper[4764]: E1001 16:17:21.740162 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/16c6ad23-1b5b-4a4e-bd34-a34240c67a33/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.780340 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.785809 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 16:17:21 crc 
kubenswrapper[4764]: I1001 16:17:21.854255 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:21 crc kubenswrapper[4764]: W1001 16:17:21.926251 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode870edc1_ed3c_4c16_8c20_cde661ac4ce0.slice/crio-af519d8a1dd1abc8b6e419aac61ec35641a28f4b41681ed0cc50d2d66a5c4e5a WatchSource:0}: Error finding container af519d8a1dd1abc8b6e419aac61ec35641a28f4b41681ed0cc50d2d66a5c4e5a: Status 404 returned error can't find the container with id af519d8a1dd1abc8b6e419aac61ec35641a28f4b41681ed0cc50d2d66a5c4e5a Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.928881 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 16:17:21 crc kubenswrapper[4764]: I1001 16:17:21.933498 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wjjhq"] Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.025330 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cfp4k"] Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.034384 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:22 crc kubenswrapper[4764]: W1001 16:17:22.053578 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda235e7c_70c7_4e8e_bf34_260bfc0cb986.slice/crio-5548010aff310ebaba522494083f9b4eb2b4b64ee85e8ad756d7f817bff9e1a5 WatchSource:0}: Error finding container 5548010aff310ebaba522494083f9b4eb2b4b64ee85e8ad756d7f817bff9e1a5: Status 404 returned error can't find the container with id 5548010aff310ebaba522494083f9b4eb2b4b64ee85e8ad756d7f817bff9e1a5 Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.075932 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.133505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6e396-f88e-4a43-a3c2-b6f951b87a77-config\") pod \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.133630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxk28\" (UniqueName: \"kubernetes.io/projected/04a6e396-f88e-4a43-a3c2-b6f951b87a77-kube-api-access-qxk28\") pod \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\" (UID: \"04a6e396-f88e-4a43-a3c2-b6f951b87a77\") " Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.133701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj5zm\" (UniqueName: \"kubernetes.io/projected/e788c14c-6dea-49ca-a317-0ea8fd947371-kube-api-access-qj5zm\") pod \"e788c14c-6dea-49ca-a317-0ea8fd947371\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.133742 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-dns-svc\") pod \"e788c14c-6dea-49ca-a317-0ea8fd947371\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.133760 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-config\") pod \"e788c14c-6dea-49ca-a317-0ea8fd947371\" (UID: \"e788c14c-6dea-49ca-a317-0ea8fd947371\") " Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.134650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e788c14c-6dea-49ca-a317-0ea8fd947371" (UID: "e788c14c-6dea-49ca-a317-0ea8fd947371"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.134676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-config" (OuterVolumeSpecName: "config") pod "e788c14c-6dea-49ca-a317-0ea8fd947371" (UID: "e788c14c-6dea-49ca-a317-0ea8fd947371"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.134952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a6e396-f88e-4a43-a3c2-b6f951b87a77-config" (OuterVolumeSpecName: "config") pod "04a6e396-f88e-4a43-a3c2-b6f951b87a77" (UID: "04a6e396-f88e-4a43-a3c2-b6f951b87a77"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.141303 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a6e396-f88e-4a43-a3c2-b6f951b87a77-kube-api-access-qxk28" (OuterVolumeSpecName: "kube-api-access-qxk28") pod "04a6e396-f88e-4a43-a3c2-b6f951b87a77" (UID: "04a6e396-f88e-4a43-a3c2-b6f951b87a77"). InnerVolumeSpecName "kube-api-access-qxk28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.141359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e788c14c-6dea-49ca-a317-0ea8fd947371-kube-api-access-qj5zm" (OuterVolumeSpecName: "kube-api-access-qj5zm") pod "e788c14c-6dea-49ca-a317-0ea8fd947371" (UID: "e788c14c-6dea-49ca-a317-0ea8fd947371"). InnerVolumeSpecName "kube-api-access-qj5zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.236427 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04a6e396-f88e-4a43-a3c2-b6f951b87a77-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.236475 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxk28\" (UniqueName: \"kubernetes.io/projected/04a6e396-f88e-4a43-a3c2-b6f951b87a77-kube-api-access-qxk28\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.236495 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj5zm\" (UniqueName: \"kubernetes.io/projected/e788c14c-6dea-49ca-a317-0ea8fd947371-kube-api-access-qj5zm\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.236544 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.236562 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c14c-6dea-49ca-a317-0ea8fd947371-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.384795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.509957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"479f4015-9972-4350-bac3-6292b0c962ec","Type":"ContainerStarted","Data":"2162ff10b4ebcba6b6f0773dc5dd0ec1a6384c8fe386119983bc7779045a2dcd"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.513932 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e870edc1-ed3c-4c16-8c20-cde661ac4ce0","Type":"ContainerStarted","Data":"af519d8a1dd1abc8b6e419aac61ec35641a28f4b41681ed0cc50d2d66a5c4e5a"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.515394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" event={"ID":"04a6e396-f88e-4a43-a3c2-b6f951b87a77","Type":"ContainerDied","Data":"a03ed4fd73aa9fe1b1a1e74e3d4984c8b9e25a49368ed244e8fd22aad74f1aeb"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.515423 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-tf6fb" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.519226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0","Type":"ContainerStarted","Data":"4765fa5e059c57728c32adc65abc37cc92c17540b3a9194f4f472c1b2519905e"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.520957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"923cf4a9-e116-4a84-ae06-dec150a649bc","Type":"ContainerStarted","Data":"816d35de4436d7b405fb173bebc356f9b17dff4bbc65ecaa2324cbcca19f3412"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.524248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" event={"ID":"846f753c-873c-4b5d-a31f-1bbc13cafb80","Type":"ContainerStarted","Data":"68a719572c528127ec529e7f7152b07b0b2780d2cd1064d2035bbf1e3c59aa3a"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.524295 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.525699 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca","Type":"ContainerStarted","Data":"08dedfcc81675f3929bde0427bd2624aa87e14bc3a9d91d26122004eda6e86c7"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.535604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c346177b-4aeb-43b2-8f86-ce57d0d42c10","Type":"ContainerStarted","Data":"60c68e5779f32bf0c8f80c6e5cf2f73deeec46f961e1ce5da03aa372ce096bb0"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.537018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" 
event={"ID":"e788c14c-6dea-49ca-a317-0ea8fd947371","Type":"ContainerDied","Data":"602bea37657cce1bdcc4c942714a6047e6e05405dbe6271f8260cee0aae9848a"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.537035 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-gjb2d" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.546089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4acbf2ba-c326-445b-b6f6-11458a1dfb68","Type":"ContainerStarted","Data":"2c000626962a90b2c0cb3a7780e5edc3e5499f7f465eec8080e225d562872f45"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.547786 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wjjhq" event={"ID":"79962852-f159-44df-bd50-38928f3df91d","Type":"ContainerStarted","Data":"773bf9d007958b4dfbd8c932d480b46320b4b5c1a5c474ed9b0c7caa00948a1e"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.548770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfp4k" event={"ID":"da235e7c-70c7-4e8e-bf34-260bfc0cb986","Type":"ContainerStarted","Data":"5548010aff310ebaba522494083f9b4eb2b4b64ee85e8ad756d7f817bff9e1a5"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.550209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bf385ea-f77a-4773-9c0c-e57f611707db","Type":"ContainerStarted","Data":"0a6dfc93a264ae5ec03e23412973b42d371d89d8f78983afdc3e5e56c0162193"} Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.551848 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" podStartSLOduration=5.03930191 podStartE2EDuration="15.551830094s" podCreationTimestamp="2025-10-01 16:17:07 +0000 UTC" firstStartedPulling="2025-10-01 16:17:10.697079558 +0000 UTC m=+893.696726393" lastFinishedPulling="2025-10-01 16:17:21.209607742 
+0000 UTC m=+904.209254577" observedRunningTime="2025-10-01 16:17:22.539966811 +0000 UTC m=+905.539613646" watchObservedRunningTime="2025-10-01 16:17:22.551830094 +0000 UTC m=+905.551476929" Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.574964 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tf6fb"] Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.579717 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-tf6fb"] Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.618365 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gjb2d"] Oct 01 16:17:22 crc kubenswrapper[4764]: I1001 16:17:22.622657 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-gjb2d"] Oct 01 16:17:23 crc kubenswrapper[4764]: I1001 16:17:23.732225 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a6e396-f88e-4a43-a3c2-b6f951b87a77" path="/var/lib/kubelet/pods/04a6e396-f88e-4a43-a3c2-b6f951b87a77/volumes" Oct 01 16:17:23 crc kubenswrapper[4764]: I1001 16:17:23.732948 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e788c14c-6dea-49ca-a317-0ea8fd947371" path="/var/lib/kubelet/pods/e788c14c-6dea-49ca-a317-0ea8fd947371/volumes" Oct 01 16:17:28 crc kubenswrapper[4764]: I1001 16:17:28.299975 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:17:28 crc kubenswrapper[4764]: I1001 16:17:28.371116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nsgv9"] Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.607749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"923cf4a9-e116-4a84-ae06-dec150a649bc","Type":"ContainerStarted","Data":"fd6052ccffef13cac568b65c257cf14cfca4cd3eb859c0748c18914b45096d0d"} Oct 
01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.610620 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4acbf2ba-c326-445b-b6f6-11458a1dfb68","Type":"ContainerStarted","Data":"cbd7d1f6cf35ac79ad329ae845f08c24ac9ae0ab3213967dae3dae75942215a7"} Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.613659 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca","Type":"ContainerStarted","Data":"c6c4e89c6ed525edbd137d5babc635932fe8583bf88f0005c5a6fc3d56e874e0"} Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.616905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c346177b-4aeb-43b2-8f86-ce57d0d42c10","Type":"ContainerStarted","Data":"d345bee3122068f5190f9dbbd342a890bbb83943bbafc1e6fa51537f8962a1c1"} Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.617106 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.619489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfp4k" event={"ID":"da235e7c-70c7-4e8e-bf34-260bfc0cb986","Type":"ContainerStarted","Data":"180affc0d3da63bc92c63672d1021e8072c8c28f33bb79b3b624ccc100d7f48b"} Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.624805 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" event={"ID":"16c6ad23-1b5b-4a4e-bd34-a34240c67a33","Type":"ContainerStarted","Data":"c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8"} Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.624942 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerName="dnsmasq-dns" 
containerID="cri-o://c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8" gracePeriod=10 Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.625120 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.662621 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.59844651 podStartE2EDuration="18.662598398s" podCreationTimestamp="2025-10-01 16:17:12 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.795946417 +0000 UTC m=+904.795593252" lastFinishedPulling="2025-10-01 16:17:26.860098305 +0000 UTC m=+909.859745140" observedRunningTime="2025-10-01 16:17:30.659469701 +0000 UTC m=+913.659116546" watchObservedRunningTime="2025-10-01 16:17:30.662598398 +0000 UTC m=+913.662245223" Oct 01 16:17:30 crc kubenswrapper[4764]: I1001 16:17:30.711433 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" podStartSLOduration=10.959948886 podStartE2EDuration="23.711416013s" podCreationTimestamp="2025-10-01 16:17:07 +0000 UTC" firstStartedPulling="2025-10-01 16:17:08.361405279 +0000 UTC m=+891.361052114" lastFinishedPulling="2025-10-01 16:17:21.112872406 +0000 UTC m=+904.112519241" observedRunningTime="2025-10-01 16:17:30.705239161 +0000 UTC m=+913.704885996" watchObservedRunningTime="2025-10-01 16:17:30.711416013 +0000 UTC m=+913.711062848" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.167946 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.283122 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpt2d\" (UniqueName: \"kubernetes.io/projected/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-kube-api-access-bpt2d\") pod \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.283242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-config\") pod \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.283272 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-dns-svc\") pod \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\" (UID: \"16c6ad23-1b5b-4a4e-bd34-a34240c67a33\") " Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.376805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-kube-api-access-bpt2d" (OuterVolumeSpecName: "kube-api-access-bpt2d") pod "16c6ad23-1b5b-4a4e-bd34-a34240c67a33" (UID: "16c6ad23-1b5b-4a4e-bd34-a34240c67a33"). InnerVolumeSpecName "kube-api-access-bpt2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.385180 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpt2d\" (UniqueName: \"kubernetes.io/projected/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-kube-api-access-bpt2d\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.393481 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-config" (OuterVolumeSpecName: "config") pod "16c6ad23-1b5b-4a4e-bd34-a34240c67a33" (UID: "16c6ad23-1b5b-4a4e-bd34-a34240c67a33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.397494 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16c6ad23-1b5b-4a4e-bd34-a34240c67a33" (UID: "16c6ad23-1b5b-4a4e-bd34-a34240c67a33"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.487115 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.487149 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16c6ad23-1b5b-4a4e-bd34-a34240c67a33-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.635176 4764 generic.go:334] "Generic (PLEG): container finished" podID="da235e7c-70c7-4e8e-bf34-260bfc0cb986" containerID="180affc0d3da63bc92c63672d1021e8072c8c28f33bb79b3b624ccc100d7f48b" exitCode=0 Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.635253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfp4k" event={"ID":"da235e7c-70c7-4e8e-bf34-260bfc0cb986","Type":"ContainerDied","Data":"180affc0d3da63bc92c63672d1021e8072c8c28f33bb79b3b624ccc100d7f48b"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.636125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"479f4015-9972-4350-bac3-6292b0c962ec","Type":"ContainerStarted","Data":"52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.637878 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bf385ea-f77a-4773-9c0c-e57f611707db","Type":"ContainerStarted","Data":"e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.648631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"e870edc1-ed3c-4c16-8c20-cde661ac4ce0","Type":"ContainerStarted","Data":"318d8794424eadcb1e0019506fb177699eefb6155d2c58eda493383572f9febf"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.651889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0","Type":"ContainerStarted","Data":"2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.652042 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.658548 4764 generic.go:334] "Generic (PLEG): container finished" podID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerID="c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8" exitCode=0 Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.658604 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.658637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" event={"ID":"16c6ad23-1b5b-4a4e-bd34-a34240c67a33","Type":"ContainerDied","Data":"c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.658669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nsgv9" event={"ID":"16c6ad23-1b5b-4a4e-bd34-a34240c67a33","Type":"ContainerDied","Data":"0b1b5eb86e33a2ade12ae0fa47180e9c0a5afa95432774ea4fc43759047a16c9"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.658689 4764 scope.go:117] "RemoveContainer" containerID="c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.680479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wjjhq" event={"ID":"79962852-f159-44df-bd50-38928f3df91d","Type":"ContainerStarted","Data":"6fd7083b5ed0cea3d54d98a908729607e2f2f948f88dfb5c2df7d07deec2dd16"} Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.681165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wjjhq" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.684829 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.644693902 podStartE2EDuration="17.684804165s" podCreationTimestamp="2025-10-01 16:17:14 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.662282169 +0000 UTC m=+904.661929004" lastFinishedPulling="2025-10-01 16:17:29.702392432 +0000 UTC m=+912.702039267" observedRunningTime="2025-10-01 16:17:31.66997534 +0000 UTC m=+914.669622185" watchObservedRunningTime="2025-10-01 16:17:31.684804165 +0000 UTC m=+914.684451030" Oct 01 16:17:31 crc 
kubenswrapper[4764]: I1001 16:17:31.708455 4764 scope.go:117] "RemoveContainer" containerID="9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.763163 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wjjhq" podStartSLOduration=6.079805956 podStartE2EDuration="13.763135587s" podCreationTimestamp="2025-10-01 16:17:18 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.954803425 +0000 UTC m=+904.954450260" lastFinishedPulling="2025-10-01 16:17:29.638133016 +0000 UTC m=+912.637779891" observedRunningTime="2025-10-01 16:17:31.743343939 +0000 UTC m=+914.742990814" watchObservedRunningTime="2025-10-01 16:17:31.763135587 +0000 UTC m=+914.762782462" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.764814 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nsgv9"] Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.772035 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nsgv9"] Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.803869 4764 scope.go:117] "RemoveContainer" containerID="c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8" Oct 01 16:17:31 crc kubenswrapper[4764]: E1001 16:17:31.804884 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8\": container with ID starting with c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8 not found: ID does not exist" containerID="c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.804933 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8"} err="failed to get 
container status \"c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8\": rpc error: code = NotFound desc = could not find container \"c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8\": container with ID starting with c8ea6cead417254d41ec944bf68db83749ea639bb033613039b9424f38d6b5d8 not found: ID does not exist" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.804966 4764 scope.go:117] "RemoveContainer" containerID="9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318" Oct 01 16:17:31 crc kubenswrapper[4764]: E1001 16:17:31.805731 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318\": container with ID starting with 9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318 not found: ID does not exist" containerID="9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318" Oct 01 16:17:31 crc kubenswrapper[4764]: I1001 16:17:31.805764 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318"} err="failed to get container status \"9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318\": rpc error: code = NotFound desc = could not find container \"9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318\": container with ID starting with 9f8dc157fa5e56de70b3e2bbee80639abf3a1c62bc93739f03342edeba72f318 not found: ID does not exist" Oct 01 16:17:32 crc kubenswrapper[4764]: I1001 16:17:32.689186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cfp4k" event={"ID":"da235e7c-70c7-4e8e-bf34-260bfc0cb986","Type":"ContainerStarted","Data":"2e4fab11cc576c32e9e39565ce60dfea015b08680321b867b803ac977efa8238"} Oct 01 16:17:33 crc kubenswrapper[4764]: I1001 16:17:33.732770 4764 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" path="/var/lib/kubelet/pods/16c6ad23-1b5b-4a4e-bd34-a34240c67a33/volumes" Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.710387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e870edc1-ed3c-4c16-8c20-cde661ac4ce0","Type":"ContainerStarted","Data":"52dc424f8571bc62b216073cba5cbad43278e6038d07f3703eaf84ddcf449df1"} Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.713891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"923cf4a9-e116-4a84-ae06-dec150a649bc","Type":"ContainerStarted","Data":"827eef3a3c6c4a5375efcee928e0403069ef7f718101783b4e54b3ae967f470d"} Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.716374 4764 generic.go:334] "Generic (PLEG): container finished" podID="4acbf2ba-c326-445b-b6f6-11458a1dfb68" containerID="cbd7d1f6cf35ac79ad329ae845f08c24ac9ae0ab3213967dae3dae75942215a7" exitCode=0 Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.716460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4acbf2ba-c326-445b-b6f6-11458a1dfb68","Type":"ContainerDied","Data":"cbd7d1f6cf35ac79ad329ae845f08c24ac9ae0ab3213967dae3dae75942215a7"} Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.718571 4764 generic.go:334] "Generic (PLEG): container finished" podID="4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca" containerID="c6c4e89c6ed525edbd137d5babc635932fe8583bf88f0005c5a6fc3d56e874e0" exitCode=0 Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.718604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca","Type":"ContainerDied","Data":"c6c4e89c6ed525edbd137d5babc635932fe8583bf88f0005c5a6fc3d56e874e0"} Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.722026 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-cfp4k" event={"ID":"da235e7c-70c7-4e8e-bf34-260bfc0cb986","Type":"ContainerStarted","Data":"0599d63cafd3296fc946ac82572235b69b70914e800a9425b8beef7af9f471d2"} Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.722408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.722471 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.751926 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.076796514 podStartE2EDuration="17.751907688s" podCreationTimestamp="2025-10-01 16:17:17 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.953289058 +0000 UTC m=+904.952935893" lastFinishedPulling="2025-10-01 16:17:33.628400232 +0000 UTC m=+916.628047067" observedRunningTime="2025-10-01 16:17:34.741753448 +0000 UTC m=+917.741400293" watchObservedRunningTime="2025-10-01 16:17:34.751907688 +0000 UTC m=+917.751554533" Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.783512 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cfp4k" podStartSLOduration=9.201295621 podStartE2EDuration="16.783491298s" podCreationTimestamp="2025-10-01 16:17:18 +0000 UTC" firstStartedPulling="2025-10-01 16:17:22.055840907 +0000 UTC m=+905.055487742" lastFinishedPulling="2025-10-01 16:17:29.638036544 +0000 UTC m=+912.637683419" observedRunningTime="2025-10-01 16:17:34.777433947 +0000 UTC m=+917.777080872" watchObservedRunningTime="2025-10-01 16:17:34.783491298 +0000 UTC m=+917.783138143" Oct 01 16:17:34 crc kubenswrapper[4764]: I1001 16:17:34.815232 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.6625212620000003 
podStartE2EDuration="14.815199599s" podCreationTimestamp="2025-10-01 16:17:20 +0000 UTC" firstStartedPulling="2025-10-01 16:17:22.464856147 +0000 UTC m=+905.464502992" lastFinishedPulling="2025-10-01 16:17:33.617534484 +0000 UTC m=+916.617181329" observedRunningTime="2025-10-01 16:17:34.797016081 +0000 UTC m=+917.796662926" watchObservedRunningTime="2025-10-01 16:17:34.815199599 +0000 UTC m=+917.814846484" Oct 01 16:17:35 crc kubenswrapper[4764]: I1001 16:17:35.738380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4acbf2ba-c326-445b-b6f6-11458a1dfb68","Type":"ContainerStarted","Data":"359eb0d9c4607bc3a1012451a29533bdd196abceaacd8d267f0148abe1ef1ec8"} Oct 01 16:17:35 crc kubenswrapper[4764]: I1001 16:17:35.738912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca","Type":"ContainerStarted","Data":"43a08cc5670bd937248b9754074c86c490aab604db1c403d4509cf378a35321d"} Oct 01 16:17:35 crc kubenswrapper[4764]: I1001 16:17:35.809529 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.335733257 podStartE2EDuration="25.809504438s" podCreationTimestamp="2025-10-01 16:17:10 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.680661362 +0000 UTC m=+904.680308197" lastFinishedPulling="2025-10-01 16:17:29.154432543 +0000 UTC m=+912.154079378" observedRunningTime="2025-10-01 16:17:35.777511889 +0000 UTC m=+918.777158754" watchObservedRunningTime="2025-10-01 16:17:35.809504438 +0000 UTC m=+918.809151303" Oct 01 16:17:35 crc kubenswrapper[4764]: I1001 16:17:35.816532 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.459695614 podStartE2EDuration="25.816512251s" podCreationTimestamp="2025-10-01 16:17:10 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.8171784 +0000 UTC 
m=+904.816825235" lastFinishedPulling="2025-10-01 16:17:28.173995037 +0000 UTC m=+911.173641872" observedRunningTime="2025-10-01 16:17:35.807257293 +0000 UTC m=+918.806904168" watchObservedRunningTime="2025-10-01 16:17:35.816512251 +0000 UTC m=+918.816159136" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.492724 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.564976 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.750986 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.821797 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.855731 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.855833 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:36 crc kubenswrapper[4764]: I1001 16:17:36.930388 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.107842 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gntz4"] Oct 01 16:17:37 crc kubenswrapper[4764]: E1001 16:17:37.108347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerName="init" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.108372 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerName="init" Oct 01 16:17:37 crc 
kubenswrapper[4764]: E1001 16:17:37.108387 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerName="dnsmasq-dns" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.108396 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerName="dnsmasq-dns" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.108593 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c6ad23-1b5b-4a4e-bd34-a34240c67a33" containerName="dnsmasq-dns" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.109754 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.115607 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.127224 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gntz4"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.138227 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-r8kfk"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.139231 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.141387 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.166406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r8kfk"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.186616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnggr\" (UniqueName: \"kubernetes.io/projected/7297f71c-9c31-4790-87b0-8b80a7abe531-kube-api-access-bnggr\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.186662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.186716 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-config\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.186916 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnggr\" (UniqueName: \"kubernetes.io/projected/7297f71c-9c31-4790-87b0-8b80a7abe531-kube-api-access-bnggr\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80dd0b93-add9-4524-8cef-a32b4250e094-combined-ca-bundle\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/80dd0b93-add9-4524-8cef-a32b4250e094-ovs-rundir\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288797 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxwv\" (UniqueName: \"kubernetes.io/projected/80dd0b93-add9-4524-8cef-a32b4250e094-kube-api-access-fwxwv\") pod \"ovn-controller-metrics-r8kfk\" (UID: 
\"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288831 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80dd0b93-add9-4524-8cef-a32b4250e094-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-config\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.288989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/80dd0b93-add9-4524-8cef-a32b4250e094-ovn-rundir\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.289030 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.289134 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dd0b93-add9-4524-8cef-a32b4250e094-config\") pod \"ovn-controller-metrics-r8kfk\" (UID: 
\"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.289630 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.289854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.289931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-config\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.314873 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnggr\" (UniqueName: \"kubernetes.io/projected/7297f71c-9c31-4790-87b0-8b80a7abe531-kube-api-access-bnggr\") pod \"dnsmasq-dns-7fd796d7df-gntz4\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.350863 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gntz4"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.351626 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.382857 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5tgg8"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.384059 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.385909 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.390567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/80dd0b93-add9-4524-8cef-a32b4250e094-ovn-rundir\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.390636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dd0b93-add9-4524-8cef-a32b4250e094-config\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.390682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80dd0b93-add9-4524-8cef-a32b4250e094-combined-ca-bundle\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.390704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/80dd0b93-add9-4524-8cef-a32b4250e094-ovs-rundir\") pod 
\"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.390728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxwv\" (UniqueName: \"kubernetes.io/projected/80dd0b93-add9-4524-8cef-a32b4250e094-kube-api-access-fwxwv\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.390751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80dd0b93-add9-4524-8cef-a32b4250e094-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.391605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/80dd0b93-add9-4524-8cef-a32b4250e094-ovs-rundir\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.391916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/80dd0b93-add9-4524-8cef-a32b4250e094-ovn-rundir\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.392674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80dd0b93-add9-4524-8cef-a32b4250e094-config\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") 
" pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.395234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/80dd0b93-add9-4524-8cef-a32b4250e094-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.395972 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80dd0b93-add9-4524-8cef-a32b4250e094-combined-ca-bundle\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.406861 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5tgg8"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.409521 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxwv\" (UniqueName: \"kubernetes.io/projected/80dd0b93-add9-4524-8cef-a32b4250e094-kube-api-access-fwxwv\") pod \"ovn-controller-metrics-r8kfk\" (UID: \"80dd0b93-add9-4524-8cef-a32b4250e094\") " pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.456687 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-r8kfk" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.492455 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.492553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-config\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.492609 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.492688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.492718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkf55\" (UniqueName: \"kubernetes.io/projected/1f5264d7-844c-4394-916a-efaed2507401-kube-api-access-xkf55\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" 
(UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.594242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-config\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.594598 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.594668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkf55\" (UniqueName: \"kubernetes.io/projected/1f5264d7-844c-4394-916a-efaed2507401-kube-api-access-xkf55\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.594687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.594724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.595186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-config\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.595456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.595770 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.596147 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.610263 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.615365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkf55\" (UniqueName: \"kubernetes.io/projected/1f5264d7-844c-4394-916a-efaed2507401-kube-api-access-xkf55\") pod 
\"dnsmasq-dns-86db49b7ff-5tgg8\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.702825 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-r8kfk"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.759437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8kfk" event={"ID":"80dd0b93-add9-4524-8cef-a32b4250e094","Type":"ContainerStarted","Data":"67eca9bda992ca1db3e949d3d50ca8d60f19f1aeca46ca03ad5c4a015ee7268d"} Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.781570 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.804877 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.809758 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gntz4"] Oct 01 16:17:37 crc kubenswrapper[4764]: W1001 16:17:37.827146 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7297f71c_9c31_4790_87b0_8b80a7abe531.slice/crio-4e5ddf1ffd12bee442bbd0d88703c4c09f99faea5595fc994b8cc174e1e79c33 WatchSource:0}: Error finding container 4e5ddf1ffd12bee442bbd0d88703c4c09f99faea5595fc994b8cc174e1e79c33: Status 404 returned error can't find the container with id 4e5ddf1ffd12bee442bbd0d88703c4c09f99faea5595fc994b8cc174e1e79c33 Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.961562 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.969395 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.972574 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.972764 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5dntz" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.972831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.972954 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 16:17:37 crc kubenswrapper[4764]: I1001 16:17:37.993184 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.101718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.101765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.101840 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.101908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-scripts\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.101951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.101986 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8tr\" (UniqueName: \"kubernetes.io/projected/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-kube-api-access-wx8tr\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.102076 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-config\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.205039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-scripts\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206217 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8tr\" (UniqueName: \"kubernetes.io/projected/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-kube-api-access-wx8tr\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206418 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-config\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206495 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206654 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-scripts\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.206785 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.207532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-config\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.211875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.211891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.211890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.222974 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8tr\" (UniqueName: \"kubernetes.io/projected/ca14b5f3-e2fc-4fc1-9800-d64209a4c266-kube-api-access-wx8tr\") pod \"ovn-northd-0\" (UID: \"ca14b5f3-e2fc-4fc1-9800-d64209a4c266\") " pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.315418 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.344987 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5tgg8"] Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.768274 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" event={"ID":"7297f71c-9c31-4790-87b0-8b80a7abe531","Type":"ContainerStarted","Data":"4e5ddf1ffd12bee442bbd0d88703c4c09f99faea5595fc994b8cc174e1e79c33"} Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.769906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" event={"ID":"1f5264d7-844c-4394-916a-efaed2507401","Type":"ContainerStarted","Data":"a782ff8caa7867881b889bbbdb3abaacb301f4b280a922e22c51c8e8672ebb87"} Oct 01 16:17:38 crc kubenswrapper[4764]: I1001 16:17:38.775599 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 16:17:38 crc kubenswrapper[4764]: W1001 16:17:38.779382 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca14b5f3_e2fc_4fc1_9800_d64209a4c266.slice/crio-794af6fb69af8c10066111686be7e9f18db16d0bd8956541a0120b2a52b605f7 WatchSource:0}: Error 
finding container 794af6fb69af8c10066111686be7e9f18db16d0bd8956541a0120b2a52b605f7: Status 404 returned error can't find the container with id 794af6fb69af8c10066111686be7e9f18db16d0bd8956541a0120b2a52b605f7 Oct 01 16:17:39 crc kubenswrapper[4764]: I1001 16:17:39.783713 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ca14b5f3-e2fc-4fc1-9800-d64209a4c266","Type":"ContainerStarted","Data":"794af6fb69af8c10066111686be7e9f18db16d0bd8956541a0120b2a52b605f7"} Oct 01 16:17:42 crc kubenswrapper[4764]: I1001 16:17:42.106194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 16:17:42 crc kubenswrapper[4764]: I1001 16:17:42.107193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 16:17:42 crc kubenswrapper[4764]: I1001 16:17:42.229329 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:42 crc kubenswrapper[4764]: I1001 16:17:42.229643 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:44 crc kubenswrapper[4764]: I1001 16:17:44.714199 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 16:17:53 crc kubenswrapper[4764]: I1001 16:17:51.916619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" event={"ID":"7297f71c-9c31-4790-87b0-8b80a7abe531","Type":"ContainerStarted","Data":"4a9965daf4e3266a6b1137e94cf5bd63dcadbd99db8cdc2bca3f7cd7ccc92550"} Oct 01 16:17:53 crc kubenswrapper[4764]: I1001 16:17:51.918841 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-r8kfk" 
event={"ID":"80dd0b93-add9-4524-8cef-a32b4250e094","Type":"ContainerStarted","Data":"11ebd578cb23103bc55c8301caedf447eb930d6370897ac6d364ec66eb636e9c"} Oct 01 16:17:54 crc kubenswrapper[4764]: E1001 16:17:54.009973 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3130079741/1\": happened during read: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified" Oct 01 16:17:54 crc kubenswrapper[4764]: E1001 16:17:54.010823 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-northd,Image:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,Command:[/usr/bin/ovn-northd],Args:[-vfile:off -vconsole:info --n-threads=1 --ovnnb-db=ssl:ovsdbserver-nb-0.openstack.svc.cluster.local:6641 --ovnsb-db=ssl:ovsdbserver-sb-0.openstack.svc.cluster.local:6642 --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key 
--ca-cert=/etc/pki/tls/certs/ovndbca.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:certs,Value:n598h674h59dhc8h78h55fh5bchcdh655hd4h696h5hd9h697hb6h658hf7h66fh694h568h64fh55bh659h5cdh5c4hd9h59ch5fch669h574h9ch7cq,ValueFrom:nil,},EnvVar{Name:ovnnorthd-config,Value:n5c8h7ch56bh8dh8hc4h5dch9dh68h6bhb7h598h549h5dbh66fh6bh5b4h5cch5d6h55ch57fhfch588h89h5ddh5d6h65bh65bh8dhc4h67dh569q,ValueFrom:nil,},EnvVar{Name:ovnnorthd-scripts,Value:n664hd8h66ch58dh64hc9h66bhd4h558h697h67bh557hdch664h567h669h555h696h556h556h5fh5bh569hbh665h9dh4h9bh564hc8h5b7h5c4q,ValueFrom:nil,},EnvVar{Name:tls-ca-bundle.pem,Value:n84h69h5d6h676h68hfch5bdh5d6hdh686h697h67fh699h98hcfh577h544h576h567h66fh579h89hf5hf9h5f8h586h676h9dh5c7h688h5c6h54bq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-northd-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wx8tr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/ser
viceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/status_check.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-northd-0_openstack(ca14b5f3-e2fc-4fc1-9800-d64209a4c266): ErrImagePull: rpc error: code = Canceled desc = writing blob: storing blob to file \"/var/tmp/container_images_storage3130079741/1\": happened during read: context canceled" logger="UnhandledError" Oct 01 16:17:54 crc kubenswrapper[4764]: E1001 16:17:54.266202 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ErrImagePull: \"rpc error: code = Canceled desc = writing blob: storing blob to file \\\"/var/tmp/container_images_storage3130079741/1\\\": happened during read: context canceled\"" pod="openstack/ovn-northd-0" podUID="ca14b5f3-e2fc-4fc1-9800-d64209a4c266" Oct 01 
16:17:54 crc kubenswrapper[4764]: I1001 16:17:54.956893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ca14b5f3-e2fc-4fc1-9800-d64209a4c266","Type":"ContainerStarted","Data":"c718e32e73b074654be26f2920f1b4bcf3aec64166e28f8391833ce5dff95b31"} Oct 01 16:17:54 crc kubenswrapper[4764]: E1001 16:17:54.958612 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="ca14b5f3-e2fc-4fc1-9800-d64209a4c266" Oct 01 16:17:54 crc kubenswrapper[4764]: I1001 16:17:54.962674 4764 generic.go:334] "Generic (PLEG): container finished" podID="7297f71c-9c31-4790-87b0-8b80a7abe531" containerID="4a9965daf4e3266a6b1137e94cf5bd63dcadbd99db8cdc2bca3f7cd7ccc92550" exitCode=0 Oct 01 16:17:54 crc kubenswrapper[4764]: I1001 16:17:54.962752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" event={"ID":"7297f71c-9c31-4790-87b0-8b80a7abe531","Type":"ContainerDied","Data":"4a9965daf4e3266a6b1137e94cf5bd63dcadbd99db8cdc2bca3f7cd7ccc92550"} Oct 01 16:17:54 crc kubenswrapper[4764]: I1001 16:17:54.968270 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f5264d7-844c-4394-916a-efaed2507401" containerID="8ad5566f3cd1337c594d3c8b17a80b6965095025ab1b6de33c33e015c10717ed" exitCode=0 Oct 01 16:17:54 crc kubenswrapper[4764]: I1001 16:17:54.968367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" event={"ID":"1f5264d7-844c-4394-916a-efaed2507401","Type":"ContainerDied","Data":"8ad5566f3cd1337c594d3c8b17a80b6965095025ab1b6de33c33e015c10717ed"} Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.044770 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-r8kfk" 
podStartSLOduration=18.044746234 podStartE2EDuration="18.044746234s" podCreationTimestamp="2025-10-01 16:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:17:55.01133043 +0000 UTC m=+938.010977265" watchObservedRunningTime="2025-10-01 16:17:55.044746234 +0000 UTC m=+938.044393069" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.379756 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.457993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-ovsdbserver-nb\") pod \"7297f71c-9c31-4790-87b0-8b80a7abe531\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.458145 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnggr\" (UniqueName: \"kubernetes.io/projected/7297f71c-9c31-4790-87b0-8b80a7abe531-kube-api-access-bnggr\") pod \"7297f71c-9c31-4790-87b0-8b80a7abe531\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.458294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-config\") pod \"7297f71c-9c31-4790-87b0-8b80a7abe531\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.458392 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-dns-svc\") pod \"7297f71c-9c31-4790-87b0-8b80a7abe531\" (UID: \"7297f71c-9c31-4790-87b0-8b80a7abe531\") " Oct 01 16:17:55 crc 
kubenswrapper[4764]: I1001 16:17:55.461776 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7297f71c-9c31-4790-87b0-8b80a7abe531-kube-api-access-bnggr" (OuterVolumeSpecName: "kube-api-access-bnggr") pod "7297f71c-9c31-4790-87b0-8b80a7abe531" (UID: "7297f71c-9c31-4790-87b0-8b80a7abe531"). InnerVolumeSpecName "kube-api-access-bnggr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.481350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-config" (OuterVolumeSpecName: "config") pod "7297f71c-9c31-4790-87b0-8b80a7abe531" (UID: "7297f71c-9c31-4790-87b0-8b80a7abe531"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.492697 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7297f71c-9c31-4790-87b0-8b80a7abe531" (UID: "7297f71c-9c31-4790-87b0-8b80a7abe531"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.493525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7297f71c-9c31-4790-87b0-8b80a7abe531" (UID: "7297f71c-9c31-4790-87b0-8b80a7abe531"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.560519 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.560559 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.560573 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7297f71c-9c31-4790-87b0-8b80a7abe531-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.560586 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnggr\" (UniqueName: \"kubernetes.io/projected/7297f71c-9c31-4790-87b0-8b80a7abe531-kube-api-access-bnggr\") on node \"crc\" DevicePath \"\"" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.983331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" event={"ID":"1f5264d7-844c-4394-916a-efaed2507401","Type":"ContainerStarted","Data":"d0cae7baa939d9e85671dd02e05b486f90b972e32f73e3e1ead09eda1322f2cc"} Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.984799 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.989377 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.990485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gntz4" event={"ID":"7297f71c-9c31-4790-87b0-8b80a7abe531","Type":"ContainerDied","Data":"4e5ddf1ffd12bee442bbd0d88703c4c09f99faea5595fc994b8cc174e1e79c33"} Oct 01 16:17:55 crc kubenswrapper[4764]: I1001 16:17:55.990537 4764 scope.go:117] "RemoveContainer" containerID="4a9965daf4e3266a6b1137e94cf5bd63dcadbd99db8cdc2bca3f7cd7ccc92550" Oct 01 16:17:55 crc kubenswrapper[4764]: E1001 16:17:55.996237 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-northd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified\\\"\"" pod="openstack/ovn-northd-0" podUID="ca14b5f3-e2fc-4fc1-9800-d64209a4c266" Oct 01 16:17:56 crc kubenswrapper[4764]: I1001 16:17:56.008158 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" podStartSLOduration=19.00813889 podStartE2EDuration="19.00813889s" podCreationTimestamp="2025-10-01 16:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:17:56.004911021 +0000 UTC m=+939.004557896" watchObservedRunningTime="2025-10-01 16:17:56.00813889 +0000 UTC m=+939.007785735" Oct 01 16:17:56 crc kubenswrapper[4764]: I1001 16:17:56.093236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gntz4"] Oct 01 16:17:56 crc kubenswrapper[4764]: I1001 16:17:56.095382 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gntz4"] Oct 01 16:17:56 crc kubenswrapper[4764]: I1001 16:17:56.275491 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Oct 01 16:17:56 crc kubenswrapper[4764]: I1001 16:17:56.333767 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.731638 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7297f71c-9c31-4790-87b0-8b80a7abe531" path="/var/lib/kubelet/pods/7297f71c-9c31-4790-87b0-8b80a7abe531/volumes" Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.849026 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8c9sc"] Oct 01 16:17:57 crc kubenswrapper[4764]: E1001 16:17:57.849397 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7297f71c-9c31-4790-87b0-8b80a7abe531" containerName="init" Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.849411 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7297f71c-9c31-4790-87b0-8b80a7abe531" containerName="init" Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.849587 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7297f71c-9c31-4790-87b0-8b80a7abe531" containerName="init" Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.850156 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8c9sc" Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.858029 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8c9sc"] Oct 01 16:17:57 crc kubenswrapper[4764]: I1001 16:17:57.899931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6fk\" (UniqueName: \"kubernetes.io/projected/709ae6b5-8376-41e5-9d86-d6bf54ea147b-kube-api-access-7v6fk\") pod \"glance-db-create-8c9sc\" (UID: \"709ae6b5-8376-41e5-9d86-d6bf54ea147b\") " pod="openstack/glance-db-create-8c9sc" Oct 01 16:17:58 crc kubenswrapper[4764]: I1001 16:17:58.001516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6fk\" (UniqueName: \"kubernetes.io/projected/709ae6b5-8376-41e5-9d86-d6bf54ea147b-kube-api-access-7v6fk\") pod \"glance-db-create-8c9sc\" (UID: \"709ae6b5-8376-41e5-9d86-d6bf54ea147b\") " pod="openstack/glance-db-create-8c9sc" Oct 01 16:17:58 crc kubenswrapper[4764]: I1001 16:17:58.036576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6fk\" (UniqueName: \"kubernetes.io/projected/709ae6b5-8376-41e5-9d86-d6bf54ea147b-kube-api-access-7v6fk\") pod \"glance-db-create-8c9sc\" (UID: \"709ae6b5-8376-41e5-9d86-d6bf54ea147b\") " pod="openstack/glance-db-create-8c9sc" Oct 01 16:17:58 crc kubenswrapper[4764]: I1001 16:17:58.209731 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8c9sc" Oct 01 16:17:58 crc kubenswrapper[4764]: I1001 16:17:58.375077 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:58 crc kubenswrapper[4764]: I1001 16:17:58.435241 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 16:17:58 crc kubenswrapper[4764]: I1001 16:17:58.648090 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8c9sc"] Oct 01 16:17:58 crc kubenswrapper[4764]: W1001 16:17:58.655132 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod709ae6b5_8376_41e5_9d86_d6bf54ea147b.slice/crio-834dee0d43b43eec603f8afa05e8ca1436fe6c32038d8fae867db0558f5318e5 WatchSource:0}: Error finding container 834dee0d43b43eec603f8afa05e8ca1436fe6c32038d8fae867db0558f5318e5: Status 404 returned error can't find the container with id 834dee0d43b43eec603f8afa05e8ca1436fe6c32038d8fae867db0558f5318e5 Oct 01 16:17:59 crc kubenswrapper[4764]: I1001 16:17:59.031500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8c9sc" event={"ID":"709ae6b5-8376-41e5-9d86-d6bf54ea147b","Type":"ContainerStarted","Data":"579c3a582266a04e241f31f465caff6603393ea4e58292efd17966be91509da9"} Oct 01 16:17:59 crc kubenswrapper[4764]: I1001 16:17:59.031578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8c9sc" event={"ID":"709ae6b5-8376-41e5-9d86-d6bf54ea147b","Type":"ContainerStarted","Data":"834dee0d43b43eec603f8afa05e8ca1436fe6c32038d8fae867db0558f5318e5"} Oct 01 16:17:59 crc kubenswrapper[4764]: I1001 16:17:59.054950 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-8c9sc" podStartSLOduration=2.054923842 podStartE2EDuration="2.054923842s" 
podCreationTimestamp="2025-10-01 16:17:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:17:59.054817119 +0000 UTC m=+942.054463964" watchObservedRunningTime="2025-10-01 16:17:59.054923842 +0000 UTC m=+942.054570717" Oct 01 16:18:00 crc kubenswrapper[4764]: I1001 16:18:00.048250 4764 generic.go:334] "Generic (PLEG): container finished" podID="709ae6b5-8376-41e5-9d86-d6bf54ea147b" containerID="579c3a582266a04e241f31f465caff6603393ea4e58292efd17966be91509da9" exitCode=0 Oct 01 16:18:00 crc kubenswrapper[4764]: I1001 16:18:00.048514 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8c9sc" event={"ID":"709ae6b5-8376-41e5-9d86-d6bf54ea147b","Type":"ContainerDied","Data":"579c3a582266a04e241f31f465caff6603393ea4e58292efd17966be91509da9"} Oct 01 16:18:01 crc kubenswrapper[4764]: I1001 16:18:01.396257 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8c9sc" Oct 01 16:18:01 crc kubenswrapper[4764]: I1001 16:18:01.463271 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v6fk\" (UniqueName: \"kubernetes.io/projected/709ae6b5-8376-41e5-9d86-d6bf54ea147b-kube-api-access-7v6fk\") pod \"709ae6b5-8376-41e5-9d86-d6bf54ea147b\" (UID: \"709ae6b5-8376-41e5-9d86-d6bf54ea147b\") " Oct 01 16:18:01 crc kubenswrapper[4764]: I1001 16:18:01.472330 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/709ae6b5-8376-41e5-9d86-d6bf54ea147b-kube-api-access-7v6fk" (OuterVolumeSpecName: "kube-api-access-7v6fk") pod "709ae6b5-8376-41e5-9d86-d6bf54ea147b" (UID: "709ae6b5-8376-41e5-9d86-d6bf54ea147b"). InnerVolumeSpecName "kube-api-access-7v6fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:01 crc kubenswrapper[4764]: I1001 16:18:01.566119 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v6fk\" (UniqueName: \"kubernetes.io/projected/709ae6b5-8376-41e5-9d86-d6bf54ea147b-kube-api-access-7v6fk\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.084971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8c9sc" event={"ID":"709ae6b5-8376-41e5-9d86-d6bf54ea147b","Type":"ContainerDied","Data":"834dee0d43b43eec603f8afa05e8ca1436fe6c32038d8fae867db0558f5318e5"} Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.085032 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="834dee0d43b43eec603f8afa05e8ca1436fe6c32038d8fae867db0558f5318e5" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.085186 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8c9sc" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.218970 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6thlp"] Oct 01 16:18:02 crc kubenswrapper[4764]: E1001 16:18:02.220012 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="709ae6b5-8376-41e5-9d86-d6bf54ea147b" containerName="mariadb-database-create" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.220087 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="709ae6b5-8376-41e5-9d86-d6bf54ea147b" containerName="mariadb-database-create" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.220488 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="709ae6b5-8376-41e5-9d86-d6bf54ea147b" containerName="mariadb-database-create" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.221457 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.229825 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6thlp"] Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.277267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfs84\" (UniqueName: \"kubernetes.io/projected/f115db7c-e622-4427-9d2b-f504a5166394-kube-api-access-mfs84\") pod \"keystone-db-create-6thlp\" (UID: \"f115db7c-e622-4427-9d2b-f504a5166394\") " pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.380022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfs84\" (UniqueName: \"kubernetes.io/projected/f115db7c-e622-4427-9d2b-f504a5166394-kube-api-access-mfs84\") pod \"keystone-db-create-6thlp\" (UID: \"f115db7c-e622-4427-9d2b-f504a5166394\") " pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.419604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfs84\" (UniqueName: \"kubernetes.io/projected/f115db7c-e622-4427-9d2b-f504a5166394-kube-api-access-mfs84\") pod \"keystone-db-create-6thlp\" (UID: \"f115db7c-e622-4427-9d2b-f504a5166394\") " pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.505089 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-n85kj"] Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.506347 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n85kj" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.521978 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n85kj"] Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.551584 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.582270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ncn\" (UniqueName: \"kubernetes.io/projected/06cb65e0-0860-4c2e-87e4-45216c0c3f9f-kube-api-access-54ncn\") pod \"placement-db-create-n85kj\" (UID: \"06cb65e0-0860-4c2e-87e4-45216c0c3f9f\") " pod="openstack/placement-db-create-n85kj" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.683849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ncn\" (UniqueName: \"kubernetes.io/projected/06cb65e0-0860-4c2e-87e4-45216c0c3f9f-kube-api-access-54ncn\") pod \"placement-db-create-n85kj\" (UID: \"06cb65e0-0860-4c2e-87e4-45216c0c3f9f\") " pod="openstack/placement-db-create-n85kj" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.709538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ncn\" (UniqueName: \"kubernetes.io/projected/06cb65e0-0860-4c2e-87e4-45216c0c3f9f-kube-api-access-54ncn\") pod \"placement-db-create-n85kj\" (UID: \"06cb65e0-0860-4c2e-87e4-45216c0c3f9f\") " pod="openstack/placement-db-create-n85kj" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.784234 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.832461 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-n85kj" Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.841084 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4b9c6"] Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.841411 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerName="dnsmasq-dns" containerID="cri-o://68a719572c528127ec529e7f7152b07b0b2780d2cd1064d2035bbf1e3c59aa3a" gracePeriod=10 Oct 01 16:18:02 crc kubenswrapper[4764]: I1001 16:18:02.985717 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6thlp"] Oct 01 16:18:02 crc kubenswrapper[4764]: W1001 16:18:02.986538 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf115db7c_e622_4427_9d2b_f504a5166394.slice/crio-0e1d4e3ee67d425fe6adbd8158e7a28579d6396441f0f6373d14ce928e24dd03 WatchSource:0}: Error finding container 0e1d4e3ee67d425fe6adbd8158e7a28579d6396441f0f6373d14ce928e24dd03: Status 404 returned error can't find the container with id 0e1d4e3ee67d425fe6adbd8158e7a28579d6396441f0f6373d14ce928e24dd03 Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.091545 4764 generic.go:334] "Generic (PLEG): container finished" podID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerID="68a719572c528127ec529e7f7152b07b0b2780d2cd1064d2035bbf1e3c59aa3a" exitCode=0 Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.091595 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" event={"ID":"846f753c-873c-4b5d-a31f-1bbc13cafb80","Type":"ContainerDied","Data":"68a719572c528127ec529e7f7152b07b0b2780d2cd1064d2035bbf1e3c59aa3a"} Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.092345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-6thlp" event={"ID":"f115db7c-e622-4427-9d2b-f504a5166394","Type":"ContainerStarted","Data":"0e1d4e3ee67d425fe6adbd8158e7a28579d6396441f0f6373d14ce928e24dd03"} Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.258005 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.291691 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnpxk\" (UniqueName: \"kubernetes.io/projected/846f753c-873c-4b5d-a31f-1bbc13cafb80-kube-api-access-jnpxk\") pod \"846f753c-873c-4b5d-a31f-1bbc13cafb80\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.291945 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-config\") pod \"846f753c-873c-4b5d-a31f-1bbc13cafb80\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.291976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-dns-svc\") pod \"846f753c-873c-4b5d-a31f-1bbc13cafb80\" (UID: \"846f753c-873c-4b5d-a31f-1bbc13cafb80\") " Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.296479 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/846f753c-873c-4b5d-a31f-1bbc13cafb80-kube-api-access-jnpxk" (OuterVolumeSpecName: "kube-api-access-jnpxk") pod "846f753c-873c-4b5d-a31f-1bbc13cafb80" (UID: "846f753c-873c-4b5d-a31f-1bbc13cafb80"). InnerVolumeSpecName "kube-api-access-jnpxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.332108 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "846f753c-873c-4b5d-a31f-1bbc13cafb80" (UID: "846f753c-873c-4b5d-a31f-1bbc13cafb80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.332396 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-n85kj"] Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.338242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-config" (OuterVolumeSpecName: "config") pod "846f753c-873c-4b5d-a31f-1bbc13cafb80" (UID: "846f753c-873c-4b5d-a31f-1bbc13cafb80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.394819 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.394851 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846f753c-873c-4b5d-a31f-1bbc13cafb80-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.394865 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnpxk\" (UniqueName: \"kubernetes.io/projected/846f753c-873c-4b5d-a31f-1bbc13cafb80-kube-api-access-jnpxk\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.805767 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.821574 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wjjhq" podUID="79962852-f159-44df-bd50-38928f3df91d" containerName="ovn-controller" probeResult="failure" output=< Oct 01 16:18:03 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 16:18:03 crc kubenswrapper[4764]: > Oct 01 16:18:03 crc kubenswrapper[4764]: I1001 16:18:03.845025 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cfp4k" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.045692 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wjjhq-config-dxwmj"] Oct 01 16:18:04 crc kubenswrapper[4764]: E1001 16:18:04.046148 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerName="init" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.046175 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerName="init" Oct 01 16:18:04 crc kubenswrapper[4764]: E1001 16:18:04.046218 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerName="dnsmasq-dns" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.046230 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerName="dnsmasq-dns" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.046496 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" containerName="dnsmasq-dns" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.047260 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.049960 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.075806 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wjjhq-config-dxwmj"] Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.105342 4764 generic.go:334] "Generic (PLEG): container finished" podID="f115db7c-e622-4427-9d2b-f504a5166394" containerID="3b066cbb5e42838720cdb004db4995687c2d1c3fbef00a5540c9c32ee17e5f35" exitCode=0 Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.105410 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6thlp" event={"ID":"f115db7c-e622-4427-9d2b-f504a5166394","Type":"ContainerDied","Data":"3b066cbb5e42838720cdb004db4995687c2d1c3fbef00a5540c9c32ee17e5f35"} Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.106341 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.106363 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-additional-scripts\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.106380 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run-ovn\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.106403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdqqg\" (UniqueName: \"kubernetes.io/projected/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-kube-api-access-xdqqg\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.106437 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-log-ovn\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.106507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-scripts\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.107740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" event={"ID":"846f753c-873c-4b5d-a31f-1bbc13cafb80","Type":"ContainerDied","Data":"6307b728d39f4462f52a969587e9a001f12695b5e4bbf482b9e7b5d9fe817d86"} Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.107779 4764 scope.go:117] "RemoveContainer" containerID="68a719572c528127ec529e7f7152b07b0b2780d2cd1064d2035bbf1e3c59aa3a" Oct 01 16:18:04 crc 
kubenswrapper[4764]: I1001 16:18:04.107886 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4b9c6" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.110336 4764 generic.go:334] "Generic (PLEG): container finished" podID="06cb65e0-0860-4c2e-87e4-45216c0c3f9f" containerID="7bd6a26a8e132d970d352635b691d736a7f59bbf9f84345e6347c5f5a25870d2" exitCode=0 Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.110379 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n85kj" event={"ID":"06cb65e0-0860-4c2e-87e4-45216c0c3f9f","Type":"ContainerDied","Data":"7bd6a26a8e132d970d352635b691d736a7f59bbf9f84345e6347c5f5a25870d2"} Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.110396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n85kj" event={"ID":"06cb65e0-0860-4c2e-87e4-45216c0c3f9f","Type":"ContainerStarted","Data":"de4645e30957045a108eb2f5713c351fb0b63053269a3ce83ba602488282a654"} Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.117591 4764 generic.go:334] "Generic (PLEG): container finished" podID="479f4015-9972-4350-bac3-6292b0c962ec" containerID="52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e" exitCode=0 Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.117641 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"479f4015-9972-4350-bac3-6292b0c962ec","Type":"ContainerDied","Data":"52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e"} Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.120640 4764 generic.go:334] "Generic (PLEG): container finished" podID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerID="e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5" exitCode=0 Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.120869 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"8bf385ea-f77a-4773-9c0c-e57f611707db","Type":"ContainerDied","Data":"e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5"} Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.145173 4764 scope.go:117] "RemoveContainer" containerID="2adbe3afd8b5b27f6eb158018480410c8dd0d579f3b1852aa09d96d506304f0c" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.207735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.207772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-additional-scripts\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.207793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run-ovn\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.207825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdqqg\" (UniqueName: \"kubernetes.io/projected/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-kube-api-access-xdqqg\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc 
kubenswrapper[4764]: I1001 16:18:04.207872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-log-ovn\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.207954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-scripts\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.208797 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-additional-scripts\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.209128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.210740 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-scripts\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.211010 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run-ovn\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.211126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-log-ovn\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.249700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdqqg\" (UniqueName: \"kubernetes.io/projected/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-kube-api-access-xdqqg\") pod \"ovn-controller-wjjhq-config-dxwmj\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.288799 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4b9c6"] Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.299976 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4b9c6"] Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.410904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:04 crc kubenswrapper[4764]: I1001 16:18:04.711559 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wjjhq-config-dxwmj"] Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.135898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bf385ea-f77a-4773-9c0c-e57f611707db","Type":"ContainerStarted","Data":"908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d"} Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.136901 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.143416 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wjjhq-config-dxwmj" event={"ID":"ea7c65b6-74a1-4d58-9be5-3b508e9e981a","Type":"ContainerStarted","Data":"dbfd793092b34e3c0747c852b866b5c2dff7f267daec4c089c6705eb975fa6f8"} Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.147583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"479f4015-9972-4350-bac3-6292b0c962ec","Type":"ContainerStarted","Data":"a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f"} Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.160404 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.574487091 podStartE2EDuration="58.160386368s" podCreationTimestamp="2025-10-01 16:17:07 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.568492205 +0000 UTC m=+904.568139040" lastFinishedPulling="2025-10-01 16:17:29.154391482 +0000 UTC m=+912.154038317" observedRunningTime="2025-10-01 16:18:05.159121987 +0000 UTC m=+948.158768822" watchObservedRunningTime="2025-10-01 16:18:05.160386368 +0000 UTC m=+948.160033203" Oct 01 16:18:05 crc kubenswrapper[4764]: 
I1001 16:18:05.205591 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.199425018 podStartE2EDuration="58.205574402s" podCreationTimestamp="2025-10-01 16:17:07 +0000 UTC" firstStartedPulling="2025-10-01 16:17:21.662300439 +0000 UTC m=+904.661947274" lastFinishedPulling="2025-10-01 16:17:29.668449823 +0000 UTC m=+912.668096658" observedRunningTime="2025-10-01 16:18:05.204182538 +0000 UTC m=+948.203829373" watchObservedRunningTime="2025-10-01 16:18:05.205574402 +0000 UTC m=+948.205221237" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.627600 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n85kj" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.633809 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.733288 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846f753c-873c-4b5d-a31f-1bbc13cafb80" path="/var/lib/kubelet/pods/846f753c-873c-4b5d-a31f-1bbc13cafb80/volumes" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.744949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54ncn\" (UniqueName: \"kubernetes.io/projected/06cb65e0-0860-4c2e-87e4-45216c0c3f9f-kube-api-access-54ncn\") pod \"06cb65e0-0860-4c2e-87e4-45216c0c3f9f\" (UID: \"06cb65e0-0860-4c2e-87e4-45216c0c3f9f\") " Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.744987 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfs84\" (UniqueName: \"kubernetes.io/projected/f115db7c-e622-4427-9d2b-f504a5166394-kube-api-access-mfs84\") pod \"f115db7c-e622-4427-9d2b-f504a5166394\" (UID: \"f115db7c-e622-4427-9d2b-f504a5166394\") " Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.753371 
4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f115db7c-e622-4427-9d2b-f504a5166394-kube-api-access-mfs84" (OuterVolumeSpecName: "kube-api-access-mfs84") pod "f115db7c-e622-4427-9d2b-f504a5166394" (UID: "f115db7c-e622-4427-9d2b-f504a5166394"). InnerVolumeSpecName "kube-api-access-mfs84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.753406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06cb65e0-0860-4c2e-87e4-45216c0c3f9f-kube-api-access-54ncn" (OuterVolumeSpecName: "kube-api-access-54ncn") pod "06cb65e0-0860-4c2e-87e4-45216c0c3f9f" (UID: "06cb65e0-0860-4c2e-87e4-45216c0c3f9f"). InnerVolumeSpecName "kube-api-access-54ncn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.846944 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54ncn\" (UniqueName: \"kubernetes.io/projected/06cb65e0-0860-4c2e-87e4-45216c0c3f9f-kube-api-access-54ncn\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:05 crc kubenswrapper[4764]: I1001 16:18:05.846975 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfs84\" (UniqueName: \"kubernetes.io/projected/f115db7c-e622-4427-9d2b-f504a5166394-kube-api-access-mfs84\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.158943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6thlp" event={"ID":"f115db7c-e622-4427-9d2b-f504a5166394","Type":"ContainerDied","Data":"0e1d4e3ee67d425fe6adbd8158e7a28579d6396441f0f6373d14ce928e24dd03"} Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.159320 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e1d4e3ee67d425fe6adbd8158e7a28579d6396441f0f6373d14ce928e24dd03" Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 
16:18:06.159408 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6thlp" Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.171422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-n85kj" event={"ID":"06cb65e0-0860-4c2e-87e4-45216c0c3f9f","Type":"ContainerDied","Data":"de4645e30957045a108eb2f5713c351fb0b63053269a3ce83ba602488282a654"} Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.171486 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de4645e30957045a108eb2f5713c351fb0b63053269a3ce83ba602488282a654" Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.171588 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-n85kj" Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.181286 4764 generic.go:334] "Generic (PLEG): container finished" podID="ea7c65b6-74a1-4d58-9be5-3b508e9e981a" containerID="4b2caa0febd16d804f1c435f43a3182c494d53301c7d0b8255e18cd8268fc965" exitCode=0 Oct 01 16:18:06 crc kubenswrapper[4764]: I1001 16:18:06.182880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wjjhq-config-dxwmj" event={"ID":"ea7c65b6-74a1-4d58-9be5-3b508e9e981a","Type":"ContainerDied","Data":"4b2caa0febd16d804f1c435f43a3182c494d53301c7d0b8255e18cd8268fc965"} Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.585342 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdqqg\" (UniqueName: \"kubernetes.io/projected/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-kube-api-access-xdqqg\") pod \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683313 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run\") pod \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-log-ovn\") pod \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683521 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-additional-scripts\") pod \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683437 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run" (OuterVolumeSpecName: "var-run") pod "ea7c65b6-74a1-4d58-9be5-3b508e9e981a" (UID: "ea7c65b6-74a1-4d58-9be5-3b508e9e981a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683488 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ea7c65b6-74a1-4d58-9be5-3b508e9e981a" (UID: "ea7c65b6-74a1-4d58-9be5-3b508e9e981a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683719 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run-ovn\") pod \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-scripts\") pod \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\" (UID: \"ea7c65b6-74a1-4d58-9be5-3b508e9e981a\") " Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.683952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ea7c65b6-74a1-4d58-9be5-3b508e9e981a" (UID: "ea7c65b6-74a1-4d58-9be5-3b508e9e981a"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.684405 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.684500 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.684572 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.684433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ea7c65b6-74a1-4d58-9be5-3b508e9e981a" (UID: "ea7c65b6-74a1-4d58-9be5-3b508e9e981a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.684699 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-scripts" (OuterVolumeSpecName: "scripts") pod "ea7c65b6-74a1-4d58-9be5-3b508e9e981a" (UID: "ea7c65b6-74a1-4d58-9be5-3b508e9e981a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.691787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-kube-api-access-xdqqg" (OuterVolumeSpecName: "kube-api-access-xdqqg") pod "ea7c65b6-74a1-4d58-9be5-3b508e9e981a" (UID: "ea7c65b6-74a1-4d58-9be5-3b508e9e981a"). InnerVolumeSpecName "kube-api-access-xdqqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.786332 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.786363 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.786373 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdqqg\" (UniqueName: \"kubernetes.io/projected/ea7c65b6-74a1-4d58-9be5-3b508e9e981a-kube-api-access-xdqqg\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.956578 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8220-account-create-qbvl5"] Oct 01 16:18:07 crc kubenswrapper[4764]: E1001 16:18:07.957030 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7c65b6-74a1-4d58-9be5-3b508e9e981a" containerName="ovn-config" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.957086 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7c65b6-74a1-4d58-9be5-3b508e9e981a" containerName="ovn-config" Oct 01 16:18:07 crc kubenswrapper[4764]: E1001 16:18:07.957117 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="06cb65e0-0860-4c2e-87e4-45216c0c3f9f" containerName="mariadb-database-create" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.957133 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cb65e0-0860-4c2e-87e4-45216c0c3f9f" containerName="mariadb-database-create" Oct 01 16:18:07 crc kubenswrapper[4764]: E1001 16:18:07.957175 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f115db7c-e622-4427-9d2b-f504a5166394" containerName="mariadb-database-create" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.957187 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f115db7c-e622-4427-9d2b-f504a5166394" containerName="mariadb-database-create" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.957496 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7c65b6-74a1-4d58-9be5-3b508e9e981a" containerName="ovn-config" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.957530 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="06cb65e0-0860-4c2e-87e4-45216c0c3f9f" containerName="mariadb-database-create" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.957571 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f115db7c-e622-4427-9d2b-f504a5166394" containerName="mariadb-database-create" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.958405 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.962644 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 16:18:07 crc kubenswrapper[4764]: I1001 16:18:07.973150 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8220-account-create-qbvl5"] Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.091081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjvrf\" (UniqueName: \"kubernetes.io/projected/13af3cfe-d62e-4fa0-8444-36ec829d18fa-kube-api-access-tjvrf\") pod \"glance-8220-account-create-qbvl5\" (UID: \"13af3cfe-d62e-4fa0-8444-36ec829d18fa\") " pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.194692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjvrf\" (UniqueName: \"kubernetes.io/projected/13af3cfe-d62e-4fa0-8444-36ec829d18fa-kube-api-access-tjvrf\") pod \"glance-8220-account-create-qbvl5\" (UID: \"13af3cfe-d62e-4fa0-8444-36ec829d18fa\") " pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.205185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wjjhq-config-dxwmj" event={"ID":"ea7c65b6-74a1-4d58-9be5-3b508e9e981a","Type":"ContainerDied","Data":"dbfd793092b34e3c0747c852b866b5c2dff7f267daec4c089c6705eb975fa6f8"} Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.205381 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wjjhq-config-dxwmj" Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.205486 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbfd793092b34e3c0747c852b866b5c2dff7f267daec4c089c6705eb975fa6f8" Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.244813 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjvrf\" (UniqueName: \"kubernetes.io/projected/13af3cfe-d62e-4fa0-8444-36ec829d18fa-kube-api-access-tjvrf\") pod \"glance-8220-account-create-qbvl5\" (UID: \"13af3cfe-d62e-4fa0-8444-36ec829d18fa\") " pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.275717 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.599465 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8220-account-create-qbvl5"] Oct 01 16:18:08 crc kubenswrapper[4764]: W1001 16:18:08.612028 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13af3cfe_d62e_4fa0_8444_36ec829d18fa.slice/crio-2ce883ea5cf67ff0358602c52e6e04fece281445c950a2890d3a7c20a6c47fcd WatchSource:0}: Error finding container 2ce883ea5cf67ff0358602c52e6e04fece281445c950a2890d3a7c20a6c47fcd: Status 404 returned error can't find the container with id 2ce883ea5cf67ff0358602c52e6e04fece281445c950a2890d3a7c20a6c47fcd Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.699455 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wjjhq-config-dxwmj"] Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.706557 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wjjhq-config-dxwmj"] Oct 01 16:18:08 crc kubenswrapper[4764]: I1001 16:18:08.829559 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wjjhq" Oct 01 16:18:09 crc kubenswrapper[4764]: I1001 16:18:09.023888 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:18:09 crc kubenswrapper[4764]: I1001 16:18:09.218642 4764 generic.go:334] "Generic (PLEG): container finished" podID="13af3cfe-d62e-4fa0-8444-36ec829d18fa" containerID="3277286fa513e00b97faffd5689150739eacd19aceef9423cbc6a24ff57b4888" exitCode=0 Oct 01 16:18:09 crc kubenswrapper[4764]: I1001 16:18:09.218707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8220-account-create-qbvl5" event={"ID":"13af3cfe-d62e-4fa0-8444-36ec829d18fa","Type":"ContainerDied","Data":"3277286fa513e00b97faffd5689150739eacd19aceef9423cbc6a24ff57b4888"} Oct 01 16:18:09 crc kubenswrapper[4764]: I1001 16:18:09.218748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8220-account-create-qbvl5" event={"ID":"13af3cfe-d62e-4fa0-8444-36ec829d18fa","Type":"ContainerStarted","Data":"2ce883ea5cf67ff0358602c52e6e04fece281445c950a2890d3a7c20a6c47fcd"} Oct 01 16:18:09 crc kubenswrapper[4764]: I1001 16:18:09.738798 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7c65b6-74a1-4d58-9be5-3b508e9e981a" path="/var/lib/kubelet/pods/ea7c65b6-74a1-4d58-9be5-3b508e9e981a/volumes" Oct 01 16:18:10 crc kubenswrapper[4764]: I1001 16:18:10.543983 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:10 crc kubenswrapper[4764]: I1001 16:18:10.636394 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjvrf\" (UniqueName: \"kubernetes.io/projected/13af3cfe-d62e-4fa0-8444-36ec829d18fa-kube-api-access-tjvrf\") pod \"13af3cfe-d62e-4fa0-8444-36ec829d18fa\" (UID: \"13af3cfe-d62e-4fa0-8444-36ec829d18fa\") " Oct 01 16:18:10 crc kubenswrapper[4764]: I1001 16:18:10.644778 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13af3cfe-d62e-4fa0-8444-36ec829d18fa-kube-api-access-tjvrf" (OuterVolumeSpecName: "kube-api-access-tjvrf") pod "13af3cfe-d62e-4fa0-8444-36ec829d18fa" (UID: "13af3cfe-d62e-4fa0-8444-36ec829d18fa"). InnerVolumeSpecName "kube-api-access-tjvrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:10 crc kubenswrapper[4764]: I1001 16:18:10.738631 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjvrf\" (UniqueName: \"kubernetes.io/projected/13af3cfe-d62e-4fa0-8444-36ec829d18fa-kube-api-access-tjvrf\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:11 crc kubenswrapper[4764]: I1001 16:18:11.238757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8220-account-create-qbvl5" event={"ID":"13af3cfe-d62e-4fa0-8444-36ec829d18fa","Type":"ContainerDied","Data":"2ce883ea5cf67ff0358602c52e6e04fece281445c950a2890d3a7c20a6c47fcd"} Oct 01 16:18:11 crc kubenswrapper[4764]: I1001 16:18:11.238805 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce883ea5cf67ff0358602c52e6e04fece281445c950a2890d3a7c20a6c47fcd" Oct 01 16:18:11 crc kubenswrapper[4764]: I1001 16:18:11.238852 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8220-account-create-qbvl5" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.249245 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ca14b5f3-e2fc-4fc1-9800-d64209a4c266","Type":"ContainerStarted","Data":"58ad3322b20d7a73a00363c97ef4e997b23df884435b925073dfafc3dbf2023c"} Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.249839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.268804 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.4018835689999998 podStartE2EDuration="35.268783666s" podCreationTimestamp="2025-10-01 16:17:37 +0000 UTC" firstStartedPulling="2025-10-01 16:17:38.781579557 +0000 UTC m=+921.781226402" lastFinishedPulling="2025-10-01 16:18:11.648479654 +0000 UTC m=+954.648126499" observedRunningTime="2025-10-01 16:18:12.267710069 +0000 UTC m=+955.267356904" watchObservedRunningTime="2025-10-01 16:18:12.268783666 +0000 UTC m=+955.268430511" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.339938 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9a9c-account-create-snkg4"] Oct 01 16:18:12 crc kubenswrapper[4764]: E1001 16:18:12.340393 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13af3cfe-d62e-4fa0-8444-36ec829d18fa" containerName="mariadb-account-create" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.340417 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="13af3cfe-d62e-4fa0-8444-36ec829d18fa" containerName="mariadb-account-create" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.340623 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="13af3cfe-d62e-4fa0-8444-36ec829d18fa" containerName="mariadb-account-create" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.341339 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.343543 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.349239 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a9c-account-create-snkg4"] Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.476150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgbbb\" (UniqueName: \"kubernetes.io/projected/05fa05d2-7e6c-4586-b5c3-712309bbebb5-kube-api-access-zgbbb\") pod \"keystone-9a9c-account-create-snkg4\" (UID: \"05fa05d2-7e6c-4586-b5c3-712309bbebb5\") " pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.577917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgbbb\" (UniqueName: \"kubernetes.io/projected/05fa05d2-7e6c-4586-b5c3-712309bbebb5-kube-api-access-zgbbb\") pod \"keystone-9a9c-account-create-snkg4\" (UID: \"05fa05d2-7e6c-4586-b5c3-712309bbebb5\") " pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.597725 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgbbb\" (UniqueName: \"kubernetes.io/projected/05fa05d2-7e6c-4586-b5c3-712309bbebb5-kube-api-access-zgbbb\") pod \"keystone-9a9c-account-create-snkg4\" (UID: \"05fa05d2-7e6c-4586-b5c3-712309bbebb5\") " pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.658356 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.658913 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3be7-account-create-gbfns"] Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.660211 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.674288 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3be7-account-create-gbfns"] Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.674725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.790714 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwc58\" (UniqueName: \"kubernetes.io/projected/185220ad-4140-4a41-b928-7b68e15408ba-kube-api-access-jwc58\") pod \"placement-3be7-account-create-gbfns\" (UID: \"185220ad-4140-4a41-b928-7b68e15408ba\") " pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.892687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwc58\" (UniqueName: \"kubernetes.io/projected/185220ad-4140-4a41-b928-7b68e15408ba-kube-api-access-jwc58\") pod \"placement-3be7-account-create-gbfns\" (UID: \"185220ad-4140-4a41-b928-7b68e15408ba\") " pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.919197 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwc58\" (UniqueName: \"kubernetes.io/projected/185220ad-4140-4a41-b928-7b68e15408ba-kube-api-access-jwc58\") pod \"placement-3be7-account-create-gbfns\" (UID: \"185220ad-4140-4a41-b928-7b68e15408ba\") " 
pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:12 crc kubenswrapper[4764]: I1001 16:18:12.921935 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9a9c-account-create-snkg4"] Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.040208 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.118354 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7qnvc"] Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.120274 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.122456 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.126151 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s7qgl" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.133154 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7qnvc"] Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.196432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-db-sync-config-data\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.196475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-config-data\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " 
pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.196502 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-combined-ca-bundle\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.196543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rzdc\" (UniqueName: \"kubernetes.io/projected/071b3286-64c8-4945-952e-3ba22f94e118-kube-api-access-4rzdc\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.257153 4764 generic.go:334] "Generic (PLEG): container finished" podID="05fa05d2-7e6c-4586-b5c3-712309bbebb5" containerID="7a54dc2196e6680c4efe8571c163327d370e0d111e3b3fabc746991c92249694" exitCode=0 Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.257197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a9c-account-create-snkg4" event={"ID":"05fa05d2-7e6c-4586-b5c3-712309bbebb5","Type":"ContainerDied","Data":"7a54dc2196e6680c4efe8571c163327d370e0d111e3b3fabc746991c92249694"} Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.257249 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a9c-account-create-snkg4" event={"ID":"05fa05d2-7e6c-4586-b5c3-712309bbebb5","Type":"ContainerStarted","Data":"dfb94200b6a46537288ffa21829051ed2222c4f608a4254368566a7c672db019"} Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.298511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rzdc\" (UniqueName: 
\"kubernetes.io/projected/071b3286-64c8-4945-952e-3ba22f94e118-kube-api-access-4rzdc\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.298713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-db-sync-config-data\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.298766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-config-data\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.298808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-combined-ca-bundle\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.304806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-db-sync-config-data\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.306060 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-config-data\") pod \"glance-db-sync-7qnvc\" (UID: 
\"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.306094 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-combined-ca-bundle\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.313796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rzdc\" (UniqueName: \"kubernetes.io/projected/071b3286-64c8-4945-952e-3ba22f94e118-kube-api-access-4rzdc\") pod \"glance-db-sync-7qnvc\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") " pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.361986 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3be7-account-create-gbfns"] Oct 01 16:18:13 crc kubenswrapper[4764]: W1001 16:18:13.367528 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod185220ad_4140_4a41_b928_7b68e15408ba.slice/crio-e5aaf76da9ea25ef9765e514f40d2b3502b2eb392727fa3bfd1ab95be84d0adb WatchSource:0}: Error finding container e5aaf76da9ea25ef9765e514f40d2b3502b2eb392727fa3bfd1ab95be84d0adb: Status 404 returned error can't find the container with id e5aaf76da9ea25ef9765e514f40d2b3502b2eb392727fa3bfd1ab95be84d0adb Oct 01 16:18:13 crc kubenswrapper[4764]: I1001 16:18:13.482923 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7qnvc" Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.013897 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7qnvc"] Oct 01 16:18:14 crc kubenswrapper[4764]: W1001 16:18:14.028356 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod071b3286_64c8_4945_952e_3ba22f94e118.slice/crio-ad681b09edf009091672dfdae7e30046357822953cfd1b9ed996989a138e6208 WatchSource:0}: Error finding container ad681b09edf009091672dfdae7e30046357822953cfd1b9ed996989a138e6208: Status 404 returned error can't find the container with id ad681b09edf009091672dfdae7e30046357822953cfd1b9ed996989a138e6208 Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.269768 4764 generic.go:334] "Generic (PLEG): container finished" podID="185220ad-4140-4a41-b928-7b68e15408ba" containerID="3f4cdfb94c2237a5095e2964b9fb5d01e8886ac535dc4da381d022b373593f40" exitCode=0 Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.269835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3be7-account-create-gbfns" event={"ID":"185220ad-4140-4a41-b928-7b68e15408ba","Type":"ContainerDied","Data":"3f4cdfb94c2237a5095e2964b9fb5d01e8886ac535dc4da381d022b373593f40"} Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.269864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3be7-account-create-gbfns" event={"ID":"185220ad-4140-4a41-b928-7b68e15408ba","Type":"ContainerStarted","Data":"e5aaf76da9ea25ef9765e514f40d2b3502b2eb392727fa3bfd1ab95be84d0adb"} Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.271318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7qnvc" event={"ID":"071b3286-64c8-4945-952e-3ba22f94e118","Type":"ContainerStarted","Data":"ad681b09edf009091672dfdae7e30046357822953cfd1b9ed996989a138e6208"} Oct 01 16:18:14 crc 
kubenswrapper[4764]: I1001 16:18:14.628023 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.726442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgbbb\" (UniqueName: \"kubernetes.io/projected/05fa05d2-7e6c-4586-b5c3-712309bbebb5-kube-api-access-zgbbb\") pod \"05fa05d2-7e6c-4586-b5c3-712309bbebb5\" (UID: \"05fa05d2-7e6c-4586-b5c3-712309bbebb5\") " Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.743839 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05fa05d2-7e6c-4586-b5c3-712309bbebb5-kube-api-access-zgbbb" (OuterVolumeSpecName: "kube-api-access-zgbbb") pod "05fa05d2-7e6c-4586-b5c3-712309bbebb5" (UID: "05fa05d2-7e6c-4586-b5c3-712309bbebb5"). InnerVolumeSpecName "kube-api-access-zgbbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:14 crc kubenswrapper[4764]: I1001 16:18:14.828331 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgbbb\" (UniqueName: \"kubernetes.io/projected/05fa05d2-7e6c-4586-b5c3-712309bbebb5-kube-api-access-zgbbb\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.284382 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9a9c-account-create-snkg4" Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.285166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9a9c-account-create-snkg4" event={"ID":"05fa05d2-7e6c-4586-b5c3-712309bbebb5","Type":"ContainerDied","Data":"dfb94200b6a46537288ffa21829051ed2222c4f608a4254368566a7c672db019"} Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.285199 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfb94200b6a46537288ffa21829051ed2222c4f608a4254368566a7c672db019" Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.614429 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.765513 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwc58\" (UniqueName: \"kubernetes.io/projected/185220ad-4140-4a41-b928-7b68e15408ba-kube-api-access-jwc58\") pod \"185220ad-4140-4a41-b928-7b68e15408ba\" (UID: \"185220ad-4140-4a41-b928-7b68e15408ba\") " Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.773287 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185220ad-4140-4a41-b928-7b68e15408ba-kube-api-access-jwc58" (OuterVolumeSpecName: "kube-api-access-jwc58") pod "185220ad-4140-4a41-b928-7b68e15408ba" (UID: "185220ad-4140-4a41-b928-7b68e15408ba"). InnerVolumeSpecName "kube-api-access-jwc58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:15 crc kubenswrapper[4764]: I1001 16:18:15.867421 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwc58\" (UniqueName: \"kubernetes.io/projected/185220ad-4140-4a41-b928-7b68e15408ba-kube-api-access-jwc58\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:16 crc kubenswrapper[4764]: I1001 16:18:16.294349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3be7-account-create-gbfns" event={"ID":"185220ad-4140-4a41-b928-7b68e15408ba","Type":"ContainerDied","Data":"e5aaf76da9ea25ef9765e514f40d2b3502b2eb392727fa3bfd1ab95be84d0adb"} Oct 01 16:18:16 crc kubenswrapper[4764]: I1001 16:18:16.294395 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5aaf76da9ea25ef9765e514f40d2b3502b2eb392727fa3bfd1ab95be84d0adb" Oct 01 16:18:16 crc kubenswrapper[4764]: I1001 16:18:16.294457 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3be7-account-create-gbfns" Oct 01 16:18:19 crc kubenswrapper[4764]: I1001 16:18:19.027756 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:18:19 crc kubenswrapper[4764]: I1001 16:18:19.311258 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.702694 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6jx59"] Oct 01 16:18:20 crc kubenswrapper[4764]: E1001 16:18:20.703005 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05fa05d2-7e6c-4586-b5c3-712309bbebb5" containerName="mariadb-account-create" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.703017 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="05fa05d2-7e6c-4586-b5c3-712309bbebb5" containerName="mariadb-account-create" Oct 01 
16:18:20 crc kubenswrapper[4764]: E1001 16:18:20.703064 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185220ad-4140-4a41-b928-7b68e15408ba" containerName="mariadb-account-create" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.703071 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="185220ad-4140-4a41-b928-7b68e15408ba" containerName="mariadb-account-create" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.703208 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="185220ad-4140-4a41-b928-7b68e15408ba" containerName="mariadb-account-create" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.703227 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="05fa05d2-7e6c-4586-b5c3-712309bbebb5" containerName="mariadb-account-create" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.703693 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.720190 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6jx59"] Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.801652 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4p5lt"] Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.802599 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.822709 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4p5lt"] Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.856725 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmlj\" (UniqueName: \"kubernetes.io/projected/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681-kube-api-access-vfmlj\") pod \"barbican-db-create-6jx59\" (UID: \"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681\") " pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.908337 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vt657"] Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.909251 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vt657" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.929838 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vt657"] Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.958026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmlj\" (UniqueName: \"kubernetes.io/projected/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681-kube-api-access-vfmlj\") pod \"barbican-db-create-6jx59\" (UID: \"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681\") " pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.958115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxjs\" (UniqueName: \"kubernetes.io/projected/0b95bc66-5cbe-4ef5-a2db-64f94391bf65-kube-api-access-bxxjs\") pod \"cinder-db-create-4p5lt\" (UID: \"0b95bc66-5cbe-4ef5-a2db-64f94391bf65\") " pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.972595 
4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-242lx"] Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.973485 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.976439 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.976640 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9lshv" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.976898 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.977075 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.983008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmlj\" (UniqueName: \"kubernetes.io/projected/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681-kube-api-access-vfmlj\") pod \"barbican-db-create-6jx59\" (UID: \"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681\") " pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:20 crc kubenswrapper[4764]: I1001 16:18:20.992741 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-242lx"] Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.035940 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.059823 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thjsz\" (UniqueName: \"kubernetes.io/projected/24da24d1-d732-469a-839b-bc1aea3737d8-kube-api-access-thjsz\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.060253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9c4\" (UniqueName: \"kubernetes.io/projected/819d701c-51c3-4e8d-a2f4-a9e39b81d65b-kube-api-access-8l9c4\") pod \"neutron-db-create-vt657\" (UID: \"819d701c-51c3-4e8d-a2f4-a9e39b81d65b\") " pod="openstack/neutron-db-create-vt657" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.060323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-config-data\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.060413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxjs\" (UniqueName: \"kubernetes.io/projected/0b95bc66-5cbe-4ef5-a2db-64f94391bf65-kube-api-access-bxxjs\") pod \"cinder-db-create-4p5lt\" (UID: \"0b95bc66-5cbe-4ef5-a2db-64f94391bf65\") " pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.060454 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-combined-ca-bundle\") pod \"keystone-db-sync-242lx\" (UID: 
\"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.076439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxjs\" (UniqueName: \"kubernetes.io/projected/0b95bc66-5cbe-4ef5-a2db-64f94391bf65-kube-api-access-bxxjs\") pod \"cinder-db-create-4p5lt\" (UID: \"0b95bc66-5cbe-4ef5-a2db-64f94391bf65\") " pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.122477 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.161256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-combined-ca-bundle\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.161310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thjsz\" (UniqueName: \"kubernetes.io/projected/24da24d1-d732-469a-839b-bc1aea3737d8-kube-api-access-thjsz\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.161369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9c4\" (UniqueName: \"kubernetes.io/projected/819d701c-51c3-4e8d-a2f4-a9e39b81d65b-kube-api-access-8l9c4\") pod \"neutron-db-create-vt657\" (UID: \"819d701c-51c3-4e8d-a2f4-a9e39b81d65b\") " pod="openstack/neutron-db-create-vt657" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.161414 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-config-data\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.164360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-config-data\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.166710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-combined-ca-bundle\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.185393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thjsz\" (UniqueName: \"kubernetes.io/projected/24da24d1-d732-469a-839b-bc1aea3737d8-kube-api-access-thjsz\") pod \"keystone-db-sync-242lx\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.186608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9c4\" (UniqueName: \"kubernetes.io/projected/819d701c-51c3-4e8d-a2f4-a9e39b81d65b-kube-api-access-8l9c4\") pod \"neutron-db-create-vt657\" (UID: \"819d701c-51c3-4e8d-a2f4-a9e39b81d65b\") " pod="openstack/neutron-db-create-vt657" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.223387 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vt657" Oct 01 16:18:21 crc kubenswrapper[4764]: I1001 16:18:21.325104 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:23 crc kubenswrapper[4764]: I1001 16:18:23.377940 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.168596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6jx59"] Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.229307 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vt657"] Oct 01 16:18:39 crc kubenswrapper[4764]: E1001 16:18:39.266029 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 01 16:18:39 crc kubenswrapper[4764]: E1001 16:18:39.266230 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rzdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-7qnvc_openstack(071b3286-64c8-4945-952e-3ba22f94e118): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Oct 01 16:18:39 crc kubenswrapper[4764]: E1001 16:18:39.272132 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-7qnvc" podUID="071b3286-64c8-4945-952e-3ba22f94e118" Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.294392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-242lx"] Oct 01 16:18:39 crc kubenswrapper[4764]: W1001 16:18:39.300906 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24da24d1_d732_469a_839b_bc1aea3737d8.slice/crio-68d280ffc5900715c4e93d2bfda0508d0c679c1996e5b323a915c7be587c2f5c WatchSource:0}: Error finding container 68d280ffc5900715c4e93d2bfda0508d0c679c1996e5b323a915c7be587c2f5c: Status 404 returned error can't find the container with id 68d280ffc5900715c4e93d2bfda0508d0c679c1996e5b323a915c7be587c2f5c Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.304483 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4p5lt"] Oct 01 16:18:39 crc kubenswrapper[4764]: W1001 16:18:39.309733 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b95bc66_5cbe_4ef5_a2db_64f94391bf65.slice/crio-c4f90aa99a20c710728e5d62e00f63fd4f8133c4599d25ba9fca56f0ca733ca7 WatchSource:0}: Error finding container c4f90aa99a20c710728e5d62e00f63fd4f8133c4599d25ba9fca56f0ca733ca7: Status 404 returned error can't find the container with id c4f90aa99a20c710728e5d62e00f63fd4f8133c4599d25ba9fca56f0ca733ca7 Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.535592 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-242lx" 
event={"ID":"24da24d1-d732-469a-839b-bc1aea3737d8","Type":"ContainerStarted","Data":"68d280ffc5900715c4e93d2bfda0508d0c679c1996e5b323a915c7be587c2f5c"} Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.538498 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jx59" event={"ID":"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681","Type":"ContainerStarted","Data":"95ec0f90aea22f05b6529dfe7c35539225a3ac47b4d5fec9e8db24fbe416958d"} Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.538532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jx59" event={"ID":"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681","Type":"ContainerStarted","Data":"ee5cd76be14e8fa87b25ca60f53c1181b0f59f9eec8efc9c664ac048b59bc20a"} Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.541561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vt657" event={"ID":"819d701c-51c3-4e8d-a2f4-a9e39b81d65b","Type":"ContainerStarted","Data":"817a62351be6bef79b4ca476bdf4db00f071e236e6ca54a7b052a3a8e5f809d8"} Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.541607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vt657" event={"ID":"819d701c-51c3-4e8d-a2f4-a9e39b81d65b","Type":"ContainerStarted","Data":"88ebda2db7f9de7b58549a9a3a5b75f838eabc0951452bb0a360ec4064660c62"} Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.544791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4p5lt" event={"ID":"0b95bc66-5cbe-4ef5-a2db-64f94391bf65","Type":"ContainerStarted","Data":"0b85c0940795c20d982fadb7c007e7cbc4dcf1de6aa55faf30bc0cf05b2e799e"} Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.544829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4p5lt" 
event={"ID":"0b95bc66-5cbe-4ef5-a2db-64f94391bf65","Type":"ContainerStarted","Data":"c4f90aa99a20c710728e5d62e00f63fd4f8133c4599d25ba9fca56f0ca733ca7"} Oct 01 16:18:39 crc kubenswrapper[4764]: E1001 16:18:39.546589 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-7qnvc" podUID="071b3286-64c8-4945-952e-3ba22f94e118" Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.559404 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-6jx59" podStartSLOduration=19.55937637 podStartE2EDuration="19.55937637s" podCreationTimestamp="2025-10-01 16:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:18:39.552144661 +0000 UTC m=+982.551791556" watchObservedRunningTime="2025-10-01 16:18:39.55937637 +0000 UTC m=+982.559023245" Oct 01 16:18:39 crc kubenswrapper[4764]: I1001 16:18:39.576456 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-4p5lt" podStartSLOduration=19.576424561 podStartE2EDuration="19.576424561s" podCreationTimestamp="2025-10-01 16:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:18:39.564283001 +0000 UTC m=+982.563929846" watchObservedRunningTime="2025-10-01 16:18:39.576424561 +0000 UTC m=+982.576071476" Oct 01 16:18:40 crc kubenswrapper[4764]: I1001 16:18:40.557246 4764 generic.go:334] "Generic (PLEG): container finished" podID="22612f67-a7c0-4c0c-9b45-3a2ba2ea7681" containerID="95ec0f90aea22f05b6529dfe7c35539225a3ac47b4d5fec9e8db24fbe416958d" exitCode=0 Oct 01 16:18:40 crc kubenswrapper[4764]: I1001 16:18:40.557300 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jx59" event={"ID":"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681","Type":"ContainerDied","Data":"95ec0f90aea22f05b6529dfe7c35539225a3ac47b4d5fec9e8db24fbe416958d"} Oct 01 16:18:40 crc kubenswrapper[4764]: I1001 16:18:40.560286 4764 generic.go:334] "Generic (PLEG): container finished" podID="0b95bc66-5cbe-4ef5-a2db-64f94391bf65" containerID="0b85c0940795c20d982fadb7c007e7cbc4dcf1de6aa55faf30bc0cf05b2e799e" exitCode=0 Oct 01 16:18:40 crc kubenswrapper[4764]: I1001 16:18:40.560412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4p5lt" event={"ID":"0b95bc66-5cbe-4ef5-a2db-64f94391bf65","Type":"ContainerDied","Data":"0b85c0940795c20d982fadb7c007e7cbc4dcf1de6aa55faf30bc0cf05b2e799e"} Oct 01 16:18:40 crc kubenswrapper[4764]: I1001 16:18:40.562409 4764 generic.go:334] "Generic (PLEG): container finished" podID="819d701c-51c3-4e8d-a2f4-a9e39b81d65b" containerID="817a62351be6bef79b4ca476bdf4db00f071e236e6ca54a7b052a3a8e5f809d8" exitCode=0 Oct 01 16:18:40 crc kubenswrapper[4764]: I1001 16:18:40.562429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vt657" event={"ID":"819d701c-51c3-4e8d-a2f4-a9e39b81d65b","Type":"ContainerDied","Data":"817a62351be6bef79b4ca476bdf4db00f071e236e6ca54a7b052a3a8e5f809d8"} Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.648461 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.655206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vt657" event={"ID":"819d701c-51c3-4e8d-a2f4-a9e39b81d65b","Type":"ContainerDied","Data":"88ebda2db7f9de7b58549a9a3a5b75f838eabc0951452bb0a360ec4064660c62"} Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.655250 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88ebda2db7f9de7b58549a9a3a5b75f838eabc0951452bb0a360ec4064660c62" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.657619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jx59" event={"ID":"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681","Type":"ContainerDied","Data":"ee5cd76be14e8fa87b25ca60f53c1181b0f59f9eec8efc9c664ac048b59bc20a"} Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.657660 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5cd76be14e8fa87b25ca60f53c1181b0f59f9eec8efc9c664ac048b59bc20a" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.657624 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6jx59" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.663662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4p5lt" event={"ID":"0b95bc66-5cbe-4ef5-a2db-64f94391bf65","Type":"ContainerDied","Data":"c4f90aa99a20c710728e5d62e00f63fd4f8133c4599d25ba9fca56f0ca733ca7"} Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.663714 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4f90aa99a20c710728e5d62e00f63fd4f8133c4599d25ba9fca56f0ca733ca7" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.757422 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vt657" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.817447 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.819787 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfmlj\" (UniqueName: \"kubernetes.io/projected/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681-kube-api-access-vfmlj\") pod \"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681\" (UID: \"22612f67-a7c0-4c0c-9b45-3a2ba2ea7681\") " Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.825341 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681-kube-api-access-vfmlj" (OuterVolumeSpecName: "kube-api-access-vfmlj") pod "22612f67-a7c0-4c0c-9b45-3a2ba2ea7681" (UID: "22612f67-a7c0-4c0c-9b45-3a2ba2ea7681"). InnerVolumeSpecName "kube-api-access-vfmlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.921008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l9c4\" (UniqueName: \"kubernetes.io/projected/819d701c-51c3-4e8d-a2f4-a9e39b81d65b-kube-api-access-8l9c4\") pod \"819d701c-51c3-4e8d-a2f4-a9e39b81d65b\" (UID: \"819d701c-51c3-4e8d-a2f4-a9e39b81d65b\") " Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.921440 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxxjs\" (UniqueName: \"kubernetes.io/projected/0b95bc66-5cbe-4ef5-a2db-64f94391bf65-kube-api-access-bxxjs\") pod \"0b95bc66-5cbe-4ef5-a2db-64f94391bf65\" (UID: \"0b95bc66-5cbe-4ef5-a2db-64f94391bf65\") " Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.922223 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfmlj\" (UniqueName: \"kubernetes.io/projected/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681-kube-api-access-vfmlj\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.924531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b95bc66-5cbe-4ef5-a2db-64f94391bf65-kube-api-access-bxxjs" (OuterVolumeSpecName: "kube-api-access-bxxjs") pod "0b95bc66-5cbe-4ef5-a2db-64f94391bf65" (UID: "0b95bc66-5cbe-4ef5-a2db-64f94391bf65"). InnerVolumeSpecName "kube-api-access-bxxjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:49 crc kubenswrapper[4764]: I1001 16:18:49.925381 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819d701c-51c3-4e8d-a2f4-a9e39b81d65b-kube-api-access-8l9c4" (OuterVolumeSpecName: "kube-api-access-8l9c4") pod "819d701c-51c3-4e8d-a2f4-a9e39b81d65b" (UID: "819d701c-51c3-4e8d-a2f4-a9e39b81d65b"). InnerVolumeSpecName "kube-api-access-8l9c4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:50 crc kubenswrapper[4764]: I1001 16:18:50.024686 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxjs\" (UniqueName: \"kubernetes.io/projected/0b95bc66-5cbe-4ef5-a2db-64f94391bf65-kube-api-access-bxxjs\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:50 crc kubenswrapper[4764]: I1001 16:18:50.024740 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l9c4\" (UniqueName: \"kubernetes.io/projected/819d701c-51c3-4e8d-a2f4-a9e39b81d65b-kube-api-access-8l9c4\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:50 crc kubenswrapper[4764]: I1001 16:18:50.675754 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vt657" Oct 01 16:18:50 crc kubenswrapper[4764]: I1001 16:18:50.675763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-242lx" event={"ID":"24da24d1-d732-469a-839b-bc1aea3737d8","Type":"ContainerStarted","Data":"4f926bf754746465e73cf6dcb7941c46d315fd5b4b4fc76e1dfce2b33141c411"} Oct 01 16:18:50 crc kubenswrapper[4764]: I1001 16:18:50.676586 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4p5lt" Oct 01 16:18:50 crc kubenswrapper[4764]: I1001 16:18:50.715557 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-242lx" podStartSLOduration=20.379960182 podStartE2EDuration="30.715528172s" podCreationTimestamp="2025-10-01 16:18:20 +0000 UTC" firstStartedPulling="2025-10-01 16:18:39.307453035 +0000 UTC m=+982.307099870" lastFinishedPulling="2025-10-01 16:18:49.643021025 +0000 UTC m=+992.642667860" observedRunningTime="2025-10-01 16:18:50.697547449 +0000 UTC m=+993.697194284" watchObservedRunningTime="2025-10-01 16:18:50.715528172 +0000 UTC m=+993.715175037" Oct 01 16:18:51 crc kubenswrapper[4764]: I1001 16:18:51.913531 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:18:51 crc kubenswrapper[4764]: I1001 16:18:51.914109 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:18:55 crc kubenswrapper[4764]: I1001 16:18:55.754244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7qnvc" event={"ID":"071b3286-64c8-4945-952e-3ba22f94e118","Type":"ContainerStarted","Data":"cecb434757cfeb57580889448b4d14a0327e24caa925d5bf70f67a33d376a0a9"} Oct 01 16:18:55 crc kubenswrapper[4764]: I1001 16:18:55.775589 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7qnvc" podStartSLOduration=2.041924067 podStartE2EDuration="42.775566099s" 
podCreationTimestamp="2025-10-01 16:18:13 +0000 UTC" firstStartedPulling="2025-10-01 16:18:14.030820224 +0000 UTC m=+957.030467099" lastFinishedPulling="2025-10-01 16:18:54.764462296 +0000 UTC m=+997.764109131" observedRunningTime="2025-10-01 16:18:55.768271269 +0000 UTC m=+998.767918104" watchObservedRunningTime="2025-10-01 16:18:55.775566099 +0000 UTC m=+998.775212944" Oct 01 16:18:57 crc kubenswrapper[4764]: I1001 16:18:57.771216 4764 generic.go:334] "Generic (PLEG): container finished" podID="24da24d1-d732-469a-839b-bc1aea3737d8" containerID="4f926bf754746465e73cf6dcb7941c46d315fd5b4b4fc76e1dfce2b33141c411" exitCode=0 Oct 01 16:18:57 crc kubenswrapper[4764]: I1001 16:18:57.771303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-242lx" event={"ID":"24da24d1-d732-469a-839b-bc1aea3737d8","Type":"ContainerDied","Data":"4f926bf754746465e73cf6dcb7941c46d315fd5b4b4fc76e1dfce2b33141c411"} Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.072759 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-242lx" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.177896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-combined-ca-bundle\") pod \"24da24d1-d732-469a-839b-bc1aea3737d8\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.178037 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-config-data\") pod \"24da24d1-d732-469a-839b-bc1aea3737d8\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.178318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thjsz\" (UniqueName: \"kubernetes.io/projected/24da24d1-d732-469a-839b-bc1aea3737d8-kube-api-access-thjsz\") pod \"24da24d1-d732-469a-839b-bc1aea3737d8\" (UID: \"24da24d1-d732-469a-839b-bc1aea3737d8\") " Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.183752 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24da24d1-d732-469a-839b-bc1aea3737d8-kube-api-access-thjsz" (OuterVolumeSpecName: "kube-api-access-thjsz") pod "24da24d1-d732-469a-839b-bc1aea3737d8" (UID: "24da24d1-d732-469a-839b-bc1aea3737d8"). InnerVolumeSpecName "kube-api-access-thjsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.199692 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24da24d1-d732-469a-839b-bc1aea3737d8" (UID: "24da24d1-d732-469a-839b-bc1aea3737d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.251170 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-config-data" (OuterVolumeSpecName: "config-data") pod "24da24d1-d732-469a-839b-bc1aea3737d8" (UID: "24da24d1-d732-469a-839b-bc1aea3737d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.279811 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thjsz\" (UniqueName: \"kubernetes.io/projected/24da24d1-d732-469a-839b-bc1aea3737d8-kube-api-access-thjsz\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.279854 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.279867 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24da24d1-d732-469a-839b-bc1aea3737d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.805909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-242lx" event={"ID":"24da24d1-d732-469a-839b-bc1aea3737d8","Type":"ContainerDied","Data":"68d280ffc5900715c4e93d2bfda0508d0c679c1996e5b323a915c7be587c2f5c"} Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.806229 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d280ffc5900715c4e93d2bfda0508d0c679c1996e5b323a915c7be587c2f5c" Oct 01 16:18:59 crc kubenswrapper[4764]: I1001 16:18:59.806333 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-242lx" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.078187 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lrgx9"] Oct 01 16:19:00 crc kubenswrapper[4764]: E1001 16:19:00.078767 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819d701c-51c3-4e8d-a2f4-a9e39b81d65b" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.078785 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="819d701c-51c3-4e8d-a2f4-a9e39b81d65b" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: E1001 16:19:00.078800 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b95bc66-5cbe-4ef5-a2db-64f94391bf65" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.078807 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b95bc66-5cbe-4ef5-a2db-64f94391bf65" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: E1001 16:19:00.078817 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24da24d1-d732-469a-839b-bc1aea3737d8" containerName="keystone-db-sync" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.078824 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="24da24d1-d732-469a-839b-bc1aea3737d8" containerName="keystone-db-sync" Oct 01 16:19:00 crc kubenswrapper[4764]: E1001 16:19:00.078849 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22612f67-a7c0-4c0c-9b45-3a2ba2ea7681" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.078855 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="22612f67-a7c0-4c0c-9b45-3a2ba2ea7681" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.079057 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0b95bc66-5cbe-4ef5-a2db-64f94391bf65" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.079068 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="22612f67-a7c0-4c0c-9b45-3a2ba2ea7681" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.079081 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="24da24d1-d732-469a-839b-bc1aea3737d8" containerName="keystone-db-sync" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.079098 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="819d701c-51c3-4e8d-a2f4-a9e39b81d65b" containerName="mariadb-database-create" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.079959 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.083079 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9lshv" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.083353 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.083517 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.083680 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.092795 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fzvzf"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.094763 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.101781 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lrgx9"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.129469 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fzvzf"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.195499 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-fernet-keys\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.195560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-config-data\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.196210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdpm9\" (UniqueName: \"kubernetes.io/projected/98c01c02-7f0f-450b-ad25-336b9f80e97d-kube-api-access-zdpm9\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.196741 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " 
pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.196825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.196866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-credential-keys\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.196930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-config\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.196989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xfj\" (UniqueName: \"kubernetes.io/projected/23db45bf-66ff-4806-8e6d-9d4e836bcb89-kube-api-access-w5xfj\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.197145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-combined-ca-bundle\") pod \"keystone-bootstrap-lrgx9\" (UID: 
\"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.197189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-scripts\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.197220 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.266421 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.268657 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.271296 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.271751 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.289775 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.298807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdpm9\" (UniqueName: \"kubernetes.io/projected/98c01c02-7f0f-450b-ad25-336b9f80e97d-kube-api-access-zdpm9\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.298861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76tpp\" (UniqueName: \"kubernetes.io/projected/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-kube-api-access-76tpp\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.298908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.298964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: 
\"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.298992 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-log-httpd\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299024 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-credential-keys\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299076 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-config\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299118 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-config-data\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299154 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xfj\" (UniqueName: \"kubernetes.io/projected/23db45bf-66ff-4806-8e6d-9d4e836bcb89-kube-api-access-w5xfj\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc 
kubenswrapper[4764]: I1001 16:19:00.299198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299225 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-combined-ca-bundle\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-scripts\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299282 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299326 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-fernet-keys\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299361 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-config-data\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299415 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-run-httpd\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.299514 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-scripts\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.300656 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-config\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.301367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: 
\"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.301376 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.301905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.312270 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-fernet-keys\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.314651 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-credential-keys\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.314677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-config-data\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 
16:19:00.314884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-scripts\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.316696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-combined-ca-bundle\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.324702 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xfj\" (UniqueName: \"kubernetes.io/projected/23db45bf-66ff-4806-8e6d-9d4e836bcb89-kube-api-access-w5xfj\") pod \"keystone-bootstrap-lrgx9\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.348905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdpm9\" (UniqueName: \"kubernetes.io/projected/98c01c02-7f0f-450b-ad25-336b9f80e97d-kube-api-access-zdpm9\") pod \"dnsmasq-dns-75bb4695fc-fzvzf\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-config-data\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-run-httpd\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-scripts\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76tpp\" (UniqueName: \"kubernetes.io/projected/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-kube-api-access-76tpp\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.400568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-log-httpd\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 
16:19:00.400908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-log-httpd\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.401223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-run-httpd\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.403631 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-config-data\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.410792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.412294 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.414691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-scripts\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.416496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.420536 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.434096 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fzvzf"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.437443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76tpp\" (UniqueName: \"kubernetes.io/projected/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-kube-api-access-76tpp\") pod \"ceilometer-0\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.451876 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-n8qfb"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.454086 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.458722 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.458903 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9rvt9" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.459008 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.459172 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n8qfb"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.466531 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-wswz5"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.467865 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.473798 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-wswz5"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501184 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-scripts\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501259 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-config\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501314 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmn6\" (UniqueName: \"kubernetes.io/projected/0c44de1b-4886-4c2b-a57c-a234a882e4a6-kube-api-access-fdmn6\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c44de1b-4886-4c2b-a57c-a234a882e4a6-logs\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-combined-ca-bundle\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501389 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4dfb\" (UniqueName: \"kubernetes.io/projected/cf6e5841-6e54-4901-8504-9fde292244ab-kube-api-access-d4dfb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.501409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-config-data\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.583487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4dfb\" (UniqueName: \"kubernetes.io/projected/cf6e5841-6e54-4901-8504-9fde292244ab-kube-api-access-d4dfb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-config-data\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 
16:19:00.602433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-scripts\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602455 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-config\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdmn6\" (UniqueName: \"kubernetes.io/projected/0c44de1b-4886-4c2b-a57c-a234a882e4a6-kube-api-access-fdmn6\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c44de1b-4886-4c2b-a57c-a234a882e4a6-logs\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.602545 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-combined-ca-bundle\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.603927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.606561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-combined-ca-bundle\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.606588 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-scripts\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.607198 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.607353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: 
\"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.607554 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c44de1b-4886-4c2b-a57c-a234a882e4a6-logs\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.607908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-config\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.610927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-config-data\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.633192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmn6\" (UniqueName: \"kubernetes.io/projected/0c44de1b-4886-4c2b-a57c-a234a882e4a6-kube-api-access-fdmn6\") pod \"placement-db-sync-n8qfb\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.633344 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4dfb\" (UniqueName: \"kubernetes.io/projected/cf6e5841-6e54-4901-8504-9fde292244ab-kube-api-access-d4dfb\") pod \"dnsmasq-dns-745b9ddc8c-wswz5\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 
16:19:00.780815 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.785667 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a401-account-create-pqkft"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.786629 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a401-account-create-pqkft" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.788598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.794338 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a401-account-create-pqkft"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.804493 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.884556 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-aee5-account-create-sb84m"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.893852 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aee5-account-create-sb84m"] Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.893952 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aee5-account-create-sb84m" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.896764 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.911727 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjtf\" (UniqueName: \"kubernetes.io/projected/4b49f677-c2a8-4637-a63f-93f382e73b92-kube-api-access-bkjtf\") pod \"barbican-aee5-account-create-sb84m\" (UID: \"4b49f677-c2a8-4637-a63f-93f382e73b92\") " pod="openstack/barbican-aee5-account-create-sb84m" Oct 01 16:19:00 crc kubenswrapper[4764]: I1001 16:19:00.911818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf9ws\" (UniqueName: \"kubernetes.io/projected/dac4ce93-5f70-4bc7-8eb6-533aa845be98-kube-api-access-sf9ws\") pod \"cinder-a401-account-create-pqkft\" (UID: \"dac4ce93-5f70-4bc7-8eb6-533aa845be98\") " pod="openstack/cinder-a401-account-create-pqkft" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.008621 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lrgx9"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.013154 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf9ws\" (UniqueName: \"kubernetes.io/projected/dac4ce93-5f70-4bc7-8eb6-533aa845be98-kube-api-access-sf9ws\") pod \"cinder-a401-account-create-pqkft\" (UID: \"dac4ce93-5f70-4bc7-8eb6-533aa845be98\") " pod="openstack/cinder-a401-account-create-pqkft" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.013261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjtf\" (UniqueName: \"kubernetes.io/projected/4b49f677-c2a8-4637-a63f-93f382e73b92-kube-api-access-bkjtf\") pod \"barbican-aee5-account-create-sb84m\" (UID: 
\"4b49f677-c2a8-4637-a63f-93f382e73b92\") " pod="openstack/barbican-aee5-account-create-sb84m" Oct 01 16:19:01 crc kubenswrapper[4764]: W1001 16:19:01.014295 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23db45bf_66ff_4806_8e6d_9d4e836bcb89.slice/crio-56ef9c5937f4849c4fc3f1c577f28ed808dec5dd75ddc201a5ec0b714ac6cfba WatchSource:0}: Error finding container 56ef9c5937f4849c4fc3f1c577f28ed808dec5dd75ddc201a5ec0b714ac6cfba: Status 404 returned error can't find the container with id 56ef9c5937f4849c4fc3f1c577f28ed808dec5dd75ddc201a5ec0b714ac6cfba Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.034172 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf9ws\" (UniqueName: \"kubernetes.io/projected/dac4ce93-5f70-4bc7-8eb6-533aa845be98-kube-api-access-sf9ws\") pod \"cinder-a401-account-create-pqkft\" (UID: \"dac4ce93-5f70-4bc7-8eb6-533aa845be98\") " pod="openstack/cinder-a401-account-create-pqkft" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.034676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjtf\" (UniqueName: \"kubernetes.io/projected/4b49f677-c2a8-4637-a63f-93f382e73b92-kube-api-access-bkjtf\") pod \"barbican-aee5-account-create-sb84m\" (UID: \"4b49f677-c2a8-4637-a63f-93f382e73b92\") " pod="openstack/barbican-aee5-account-create-sb84m" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.086655 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fzvzf"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.106130 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a401-account-create-pqkft" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.188495 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1d44-account-create-54jl9"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.189637 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1d44-account-create-54jl9" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.192461 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.203021 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.212395 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1d44-account-create-54jl9"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.228380 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-aee5-account-create-sb84m" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.316880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrl6n\" (UniqueName: \"kubernetes.io/projected/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe-kube-api-access-rrl6n\") pod \"neutron-1d44-account-create-54jl9\" (UID: \"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe\") " pod="openstack/neutron-1d44-account-create-54jl9" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.377984 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n8qfb"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.399766 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-wswz5"] Oct 01 16:19:01 crc kubenswrapper[4764]: W1001 16:19:01.405752 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf6e5841_6e54_4901_8504_9fde292244ab.slice/crio-9a3b458912b9f502cab1bb5a7b16df7eb0328ff509cccefcd3ffcc2e48fc87df WatchSource:0}: Error finding container 9a3b458912b9f502cab1bb5a7b16df7eb0328ff509cccefcd3ffcc2e48fc87df: Status 404 returned error can't find the container with id 9a3b458912b9f502cab1bb5a7b16df7eb0328ff509cccefcd3ffcc2e48fc87df Oct 01 16:19:01 crc kubenswrapper[4764]: W1001 16:19:01.410079 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c44de1b_4886_4c2b_a57c_a234a882e4a6.slice/crio-a669a3716b144b610f189033196756e5e8490da544bdf5f9154bfaa2536c1d91 WatchSource:0}: Error finding container a669a3716b144b610f189033196756e5e8490da544bdf5f9154bfaa2536c1d91: Status 404 returned error can't find the container with id a669a3716b144b610f189033196756e5e8490da544bdf5f9154bfaa2536c1d91 Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.418745 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrl6n\" (UniqueName: \"kubernetes.io/projected/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe-kube-api-access-rrl6n\") pod \"neutron-1d44-account-create-54jl9\" (UID: \"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe\") " pod="openstack/neutron-1d44-account-create-54jl9" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.445944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrl6n\" (UniqueName: \"kubernetes.io/projected/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe-kube-api-access-rrl6n\") pod \"neutron-1d44-account-create-54jl9\" (UID: \"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe\") " pod="openstack/neutron-1d44-account-create-54jl9" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.591114 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1d44-account-create-54jl9" Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.649085 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a401-account-create-pqkft"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.784809 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-aee5-account-create-sb84m"] Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.831195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" event={"ID":"cf6e5841-6e54-4901-8504-9fde292244ab","Type":"ContainerStarted","Data":"ddf6b5178520dec74392eb768af8a06f199a811b0bfef63afaecc85c2d8ea291"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.831239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" event={"ID":"cf6e5841-6e54-4901-8504-9fde292244ab","Type":"ContainerStarted","Data":"9a3b458912b9f502cab1bb5a7b16df7eb0328ff509cccefcd3ffcc2e48fc87df"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.832699 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerStarted","Data":"33f5886a93b0c0196860535fd7934dbcc69fb09d8ef6473018649230834bdb17"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.837023 4764 generic.go:334] "Generic (PLEG): container finished" podID="98c01c02-7f0f-450b-ad25-336b9f80e97d" containerID="112ad572c607b9d50c0e489271dfbe0c0ded01eaf4adeab1bf04172acded486f" exitCode=0 Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.837169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" event={"ID":"98c01c02-7f0f-450b-ad25-336b9f80e97d","Type":"ContainerDied","Data":"112ad572c607b9d50c0e489271dfbe0c0ded01eaf4adeab1bf04172acded486f"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.837199 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" event={"ID":"98c01c02-7f0f-450b-ad25-336b9f80e97d","Type":"ContainerStarted","Data":"f8a1be941458f7175ce05bed0dcc5bd50b50a1ced196004d52924f1a2d394ed4"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.841662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8qfb" event={"ID":"0c44de1b-4886-4c2b-a57c-a234a882e4a6","Type":"ContainerStarted","Data":"a669a3716b144b610f189033196756e5e8490da544bdf5f9154bfaa2536c1d91"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.843169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lrgx9" event={"ID":"23db45bf-66ff-4806-8e6d-9d4e836bcb89","Type":"ContainerStarted","Data":"21abde22172561a77c076a728487da7636033b20ec7df2a9d7bc3b4386028293"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.843215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lrgx9" 
event={"ID":"23db45bf-66ff-4806-8e6d-9d4e836bcb89","Type":"ContainerStarted","Data":"56ef9c5937f4849c4fc3f1c577f28ed808dec5dd75ddc201a5ec0b714ac6cfba"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.855428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a401-account-create-pqkft" event={"ID":"dac4ce93-5f70-4bc7-8eb6-533aa845be98","Type":"ContainerStarted","Data":"ec039382cda3ef89c45670eb11d756674687c563828720a14b35ddf53685cf67"} Oct 01 16:19:01 crc kubenswrapper[4764]: I1001 16:19:01.877526 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lrgx9" podStartSLOduration=1.877506227 podStartE2EDuration="1.877506227s" podCreationTimestamp="2025-10-01 16:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:01.868189048 +0000 UTC m=+1004.867835993" watchObservedRunningTime="2025-10-01 16:19:01.877506227 +0000 UTC m=+1004.877153062" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.083783 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1d44-account-create-54jl9"] Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.118642 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.278806 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:02 crc kubenswrapper[4764]: E1001 16:19:02.320919 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b49f677_c2a8_4637_a63f_93f382e73b92.slice/crio-b740fd1f3a00f767c5ff228b40b5c7b4c7955318075b851da3594e28395ce1b2.scope\": RecentStats: unable to find data in memory cache]" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.366010 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-config\") pod \"98c01c02-7f0f-450b-ad25-336b9f80e97d\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.366379 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-dns-svc\") pod \"98c01c02-7f0f-450b-ad25-336b9f80e97d\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.366413 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-sb\") pod \"98c01c02-7f0f-450b-ad25-336b9f80e97d\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.366458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-nb\") pod \"98c01c02-7f0f-450b-ad25-336b9f80e97d\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.366561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-zdpm9\" (UniqueName: \"kubernetes.io/projected/98c01c02-7f0f-450b-ad25-336b9f80e97d-kube-api-access-zdpm9\") pod \"98c01c02-7f0f-450b-ad25-336b9f80e97d\" (UID: \"98c01c02-7f0f-450b-ad25-336b9f80e97d\") " Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.373242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c01c02-7f0f-450b-ad25-336b9f80e97d-kube-api-access-zdpm9" (OuterVolumeSpecName: "kube-api-access-zdpm9") pod "98c01c02-7f0f-450b-ad25-336b9f80e97d" (UID: "98c01c02-7f0f-450b-ad25-336b9f80e97d"). InnerVolumeSpecName "kube-api-access-zdpm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.388624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-config" (OuterVolumeSpecName: "config") pod "98c01c02-7f0f-450b-ad25-336b9f80e97d" (UID: "98c01c02-7f0f-450b-ad25-336b9f80e97d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.388665 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98c01c02-7f0f-450b-ad25-336b9f80e97d" (UID: "98c01c02-7f0f-450b-ad25-336b9f80e97d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.394627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98c01c02-7f0f-450b-ad25-336b9f80e97d" (UID: "98c01c02-7f0f-450b-ad25-336b9f80e97d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.395844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98c01c02-7f0f-450b-ad25-336b9f80e97d" (UID: "98c01c02-7f0f-450b-ad25-336b9f80e97d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.468343 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.468376 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.468386 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.468397 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98c01c02-7f0f-450b-ad25-336b9f80e97d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.468407 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdpm9\" (UniqueName: \"kubernetes.io/projected/98c01c02-7f0f-450b-ad25-336b9f80e97d-kube-api-access-zdpm9\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.866911 4764 generic.go:334] "Generic (PLEG): container finished" podID="9e62d88c-83b8-4ab4-8ad9-f231e99b83fe" 
containerID="8aae9048d9df1908aec76735625fd07337d605eb944d97a919c0fc9ce183c22c" exitCode=0 Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.866983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d44-account-create-54jl9" event={"ID":"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe","Type":"ContainerDied","Data":"8aae9048d9df1908aec76735625fd07337d605eb944d97a919c0fc9ce183c22c"} Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.867010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d44-account-create-54jl9" event={"ID":"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe","Type":"ContainerStarted","Data":"5d8517ae8b716ed845f49bec76214b75f07c0235d8b1eddb908c73c0b7636db4"} Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.868616 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf6e5841-6e54-4901-8504-9fde292244ab" containerID="ddf6b5178520dec74392eb768af8a06f199a811b0bfef63afaecc85c2d8ea291" exitCode=0 Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.868717 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" event={"ID":"cf6e5841-6e54-4901-8504-9fde292244ab","Type":"ContainerDied","Data":"ddf6b5178520dec74392eb768af8a06f199a811b0bfef63afaecc85c2d8ea291"} Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.870647 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" event={"ID":"98c01c02-7f0f-450b-ad25-336b9f80e97d","Type":"ContainerDied","Data":"f8a1be941458f7175ce05bed0dcc5bd50b50a1ced196004d52924f1a2d394ed4"} Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.870710 4764 scope.go:117] "RemoveContainer" containerID="112ad572c607b9d50c0e489271dfbe0c0ded01eaf4adeab1bf04172acded486f" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.870882 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-fzvzf" Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.875373 4764 generic.go:334] "Generic (PLEG): container finished" podID="4b49f677-c2a8-4637-a63f-93f382e73b92" containerID="b740fd1f3a00f767c5ff228b40b5c7b4c7955318075b851da3594e28395ce1b2" exitCode=0 Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.875718 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aee5-account-create-sb84m" event={"ID":"4b49f677-c2a8-4637-a63f-93f382e73b92","Type":"ContainerDied","Data":"b740fd1f3a00f767c5ff228b40b5c7b4c7955318075b851da3594e28395ce1b2"} Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.875778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aee5-account-create-sb84m" event={"ID":"4b49f677-c2a8-4637-a63f-93f382e73b92","Type":"ContainerStarted","Data":"c492e458354128838787ad9d900f3a26302bfb6531a89cdcf13a51aa48044041"} Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.888120 4764 generic.go:334] "Generic (PLEG): container finished" podID="dac4ce93-5f70-4bc7-8eb6-533aa845be98" containerID="2630a8464bfcbfc8eb1556493963ba1a7431254312e294b4cd55e42ef4474e09" exitCode=0 Oct 01 16:19:02 crc kubenswrapper[4764]: I1001 16:19:02.892817 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a401-account-create-pqkft" event={"ID":"dac4ce93-5f70-4bc7-8eb6-533aa845be98","Type":"ContainerDied","Data":"2630a8464bfcbfc8eb1556493963ba1a7431254312e294b4cd55e42ef4474e09"} Oct 01 16:19:03 crc kubenswrapper[4764]: I1001 16:19:03.027862 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fzvzf"] Oct 01 16:19:03 crc kubenswrapper[4764]: I1001 16:19:03.027919 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-fzvzf"] Oct 01 16:19:03 crc kubenswrapper[4764]: I1001 16:19:03.730976 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="98c01c02-7f0f-450b-ad25-336b9f80e97d" path="/var/lib/kubelet/pods/98c01c02-7f0f-450b-ad25-336b9f80e97d/volumes" Oct 01 16:19:03 crc kubenswrapper[4764]: I1001 16:19:03.897901 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" event={"ID":"cf6e5841-6e54-4901-8504-9fde292244ab","Type":"ContainerStarted","Data":"666984ffa11dbb84c93f93b9c576e7ea9fd99700dfed10bfb73669d09b5494ad"} Oct 01 16:19:03 crc kubenswrapper[4764]: I1001 16:19:03.898434 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:03 crc kubenswrapper[4764]: I1001 16:19:03.918099 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" podStartSLOduration=3.918022727 podStartE2EDuration="3.918022727s" podCreationTimestamp="2025-10-01 16:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:03.915703439 +0000 UTC m=+1006.915350294" watchObservedRunningTime="2025-10-01 16:19:03.918022727 +0000 UTC m=+1006.917669562" Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.330344 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aee5-account-create-sb84m" Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.335766 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a401-account-create-pqkft" Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.345239 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1d44-account-create-54jl9"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.411804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkjtf\" (UniqueName: \"kubernetes.io/projected/4b49f677-c2a8-4637-a63f-93f382e73b92-kube-api-access-bkjtf\") pod \"4b49f677-c2a8-4637-a63f-93f382e73b92\" (UID: \"4b49f677-c2a8-4637-a63f-93f382e73b92\") "
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.411875 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrl6n\" (UniqueName: \"kubernetes.io/projected/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe-kube-api-access-rrl6n\") pod \"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe\" (UID: \"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe\") "
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.411960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf9ws\" (UniqueName: \"kubernetes.io/projected/dac4ce93-5f70-4bc7-8eb6-533aa845be98-kube-api-access-sf9ws\") pod \"dac4ce93-5f70-4bc7-8eb6-533aa845be98\" (UID: \"dac4ce93-5f70-4bc7-8eb6-533aa845be98\") "
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.418390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b49f677-c2a8-4637-a63f-93f382e73b92-kube-api-access-bkjtf" (OuterVolumeSpecName: "kube-api-access-bkjtf") pod "4b49f677-c2a8-4637-a63f-93f382e73b92" (UID: "4b49f677-c2a8-4637-a63f-93f382e73b92"). InnerVolumeSpecName "kube-api-access-bkjtf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.418441 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac4ce93-5f70-4bc7-8eb6-533aa845be98-kube-api-access-sf9ws" (OuterVolumeSpecName: "kube-api-access-sf9ws") pod "dac4ce93-5f70-4bc7-8eb6-533aa845be98" (UID: "dac4ce93-5f70-4bc7-8eb6-533aa845be98"). InnerVolumeSpecName "kube-api-access-sf9ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.420211 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe-kube-api-access-rrl6n" (OuterVolumeSpecName: "kube-api-access-rrl6n") pod "9e62d88c-83b8-4ab4-8ad9-f231e99b83fe" (UID: "9e62d88c-83b8-4ab4-8ad9-f231e99b83fe"). InnerVolumeSpecName "kube-api-access-rrl6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.513987 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkjtf\" (UniqueName: \"kubernetes.io/projected/4b49f677-c2a8-4637-a63f-93f382e73b92-kube-api-access-bkjtf\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.514016 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrl6n\" (UniqueName: \"kubernetes.io/projected/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe-kube-api-access-rrl6n\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.514029 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf9ws\" (UniqueName: \"kubernetes.io/projected/dac4ce93-5f70-4bc7-8eb6-533aa845be98-kube-api-access-sf9ws\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.910852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a401-account-create-pqkft" event={"ID":"dac4ce93-5f70-4bc7-8eb6-533aa845be98","Type":"ContainerDied","Data":"ec039382cda3ef89c45670eb11d756674687c563828720a14b35ddf53685cf67"}
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.911081 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec039382cda3ef89c45670eb11d756674687c563828720a14b35ddf53685cf67"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.911136 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a401-account-create-pqkft"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.920939 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1d44-account-create-54jl9"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.920967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1d44-account-create-54jl9" event={"ID":"9e62d88c-83b8-4ab4-8ad9-f231e99b83fe","Type":"ContainerDied","Data":"5d8517ae8b716ed845f49bec76214b75f07c0235d8b1eddb908c73c0b7636db4"}
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.921014 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d8517ae8b716ed845f49bec76214b75f07c0235d8b1eddb908c73c0b7636db4"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.922134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-aee5-account-create-sb84m" event={"ID":"4b49f677-c2a8-4637-a63f-93f382e73b92","Type":"ContainerDied","Data":"c492e458354128838787ad9d900f3a26302bfb6531a89cdcf13a51aa48044041"}
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.922170 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c492e458354128838787ad9d900f3a26302bfb6531a89cdcf13a51aa48044041"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.922179 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-aee5-account-create-sb84m"
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.923233 4764 generic.go:334] "Generic (PLEG): container finished" podID="071b3286-64c8-4945-952e-3ba22f94e118" containerID="cecb434757cfeb57580889448b4d14a0327e24caa925d5bf70f67a33d376a0a9" exitCode=0
Oct 01 16:19:04 crc kubenswrapper[4764]: I1001 16:19:04.923396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7qnvc" event={"ID":"071b3286-64c8-4945-952e-3ba22f94e118","Type":"ContainerDied","Data":"cecb434757cfeb57580889448b4d14a0327e24caa925d5bf70f67a33d376a0a9"}
Oct 01 16:19:05 crc kubenswrapper[4764]: I1001 16:19:05.947926 4764 generic.go:334] "Generic (PLEG): container finished" podID="23db45bf-66ff-4806-8e6d-9d4e836bcb89" containerID="21abde22172561a77c076a728487da7636033b20ec7df2a9d7bc3b4386028293" exitCode=0
Oct 01 16:19:05 crc kubenswrapper[4764]: I1001 16:19:05.948127 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lrgx9" event={"ID":"23db45bf-66ff-4806-8e6d-9d4e836bcb89","Type":"ContainerDied","Data":"21abde22172561a77c076a728487da7636033b20ec7df2a9d7bc3b4386028293"}
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.150714 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dkmsh"]
Oct 01 16:19:06 crc kubenswrapper[4764]: E1001 16:19:06.151272 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac4ce93-5f70-4bc7-8eb6-533aa845be98" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151284 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac4ce93-5f70-4bc7-8eb6-533aa845be98" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: E1001 16:19:06.151293 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c01c02-7f0f-450b-ad25-336b9f80e97d" containerName="init"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151299 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c01c02-7f0f-450b-ad25-336b9f80e97d" containerName="init"
Oct 01 16:19:06 crc kubenswrapper[4764]: E1001 16:19:06.151314 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b49f677-c2a8-4637-a63f-93f382e73b92" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151320 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b49f677-c2a8-4637-a63f-93f382e73b92" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: E1001 16:19:06.151330 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62d88c-83b8-4ab4-8ad9-f231e99b83fe" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151335 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62d88c-83b8-4ab4-8ad9-f231e99b83fe" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151495 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c01c02-7f0f-450b-ad25-336b9f80e97d" containerName="init"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151506 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac4ce93-5f70-4bc7-8eb6-533aa845be98" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151519 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b49f677-c2a8-4637-a63f-93f382e73b92" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.151529 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e62d88c-83b8-4ab4-8ad9-f231e99b83fe" containerName="mariadb-account-create"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.152004 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.154753 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.154902 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.155126 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-p6tk8"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.156604 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dkmsh"]
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.247596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7s7\" (UniqueName: \"kubernetes.io/projected/44b571b7-d584-46bf-823a-bf8ce35c8dac-kube-api-access-bd7s7\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.247720 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-combined-ca-bundle\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.247754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-config-data\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.247788 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-scripts\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.247906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b571b7-d584-46bf-823a-bf8ce35c8dac-etc-machine-id\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.247998 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-db-sync-config-data\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.248229 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7hhkr"]
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.249954 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.253008 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.254148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xp6sm"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.254403 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7hhkr"]
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349137 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b571b7-d584-46bf-823a-bf8ce35c8dac-etc-machine-id\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krsbn\" (UniqueName: \"kubernetes.io/projected/c087a812-124b-496c-afe6-b8ba3ca79ada-kube-api-access-krsbn\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-db-sync-config-data\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-db-sync-config-data\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7s7\" (UniqueName: \"kubernetes.io/projected/44b571b7-d584-46bf-823a-bf8ce35c8dac-kube-api-access-bd7s7\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-combined-ca-bundle\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349376 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-config-data\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-scripts\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-combined-ca-bundle\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.349507 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b571b7-d584-46bf-823a-bf8ce35c8dac-etc-machine-id\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.361458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-db-sync-config-data\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.361494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-scripts\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.361788 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-config-data\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.362154 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-combined-ca-bundle\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.371759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7s7\" (UniqueName: \"kubernetes.io/projected/44b571b7-d584-46bf-823a-bf8ce35c8dac-kube-api-access-bd7s7\") pod \"cinder-db-sync-dkmsh\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.437851 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vhgdf"]
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.438876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.440869 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.441172 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.443019 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vgmcf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.443550 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7qnvc"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.450598 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-combined-ca-bundle\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.450672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krsbn\" (UniqueName: \"kubernetes.io/projected/c087a812-124b-496c-afe6-b8ba3ca79ada-kube-api-access-krsbn\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.450753 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-db-sync-config-data\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.487022 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.489179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-db-sync-config-data\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.493302 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vhgdf"]
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.494144 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-combined-ca-bundle\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.510574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krsbn\" (UniqueName: \"kubernetes.io/projected/c087a812-124b-496c-afe6-b8ba3ca79ada-kube-api-access-krsbn\") pod \"barbican-db-sync-7hhkr\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.551512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-config-data\") pod \"071b3286-64c8-4945-952e-3ba22f94e118\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") "
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.551825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-combined-ca-bundle\") pod \"071b3286-64c8-4945-952e-3ba22f94e118\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") "
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.551926 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-db-sync-config-data\") pod \"071b3286-64c8-4945-952e-3ba22f94e118\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") "
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.551996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rzdc\" (UniqueName: \"kubernetes.io/projected/071b3286-64c8-4945-952e-3ba22f94e118-kube-api-access-4rzdc\") pod \"071b3286-64c8-4945-952e-3ba22f94e118\" (UID: \"071b3286-64c8-4945-952e-3ba22f94e118\") "
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.552194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-combined-ca-bundle\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.552254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txvs5\" (UniqueName: \"kubernetes.io/projected/7b75f147-9726-4336-8467-932ad4ff15f1-kube-api-access-txvs5\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.552378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-config\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.558348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071b3286-64c8-4945-952e-3ba22f94e118-kube-api-access-4rzdc" (OuterVolumeSpecName: "kube-api-access-4rzdc") pod "071b3286-64c8-4945-952e-3ba22f94e118" (UID: "071b3286-64c8-4945-952e-3ba22f94e118"). InnerVolumeSpecName "kube-api-access-4rzdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.567562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "071b3286-64c8-4945-952e-3ba22f94e118" (UID: "071b3286-64c8-4945-952e-3ba22f94e118"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.570914 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7hhkr"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.578349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "071b3286-64c8-4945-952e-3ba22f94e118" (UID: "071b3286-64c8-4945-952e-3ba22f94e118"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.629936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-config-data" (OuterVolumeSpecName: "config-data") pod "071b3286-64c8-4945-952e-3ba22f94e118" (UID: "071b3286-64c8-4945-952e-3ba22f94e118"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-combined-ca-bundle\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txvs5\" (UniqueName: \"kubernetes.io/projected/7b75f147-9726-4336-8467-932ad4ff15f1-kube-api-access-txvs5\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654710 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-config\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654803 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654819 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654844 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rzdc\" (UniqueName: \"kubernetes.io/projected/071b3286-64c8-4945-952e-3ba22f94e118-kube-api-access-4rzdc\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.654854 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b3286-64c8-4945-952e-3ba22f94e118-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.659792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-combined-ca-bundle\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.667624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-config\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.670623 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txvs5\" (UniqueName: \"kubernetes.io/projected/7b75f147-9726-4336-8467-932ad4ff15f1-kube-api-access-txvs5\") pod \"neutron-db-sync-vhgdf\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.879238 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vhgdf"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.928062 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dkmsh"]
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.962306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkmsh" event={"ID":"44b571b7-d584-46bf-823a-bf8ce35c8dac","Type":"ContainerStarted","Data":"5f6aa17b485253db85ce363bb590ee5d8113fe180475145e1b87369522c5cf22"}
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.969322 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7qnvc"
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.969318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7qnvc" event={"ID":"071b3286-64c8-4945-952e-3ba22f94e118","Type":"ContainerDied","Data":"ad681b09edf009091672dfdae7e30046357822953cfd1b9ed996989a138e6208"}
Oct 01 16:19:06 crc kubenswrapper[4764]: I1001 16:19:06.969369 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad681b09edf009091672dfdae7e30046357822953cfd1b9ed996989a138e6208"
Oct 01 16:19:07 crc kubenswrapper[4764]: W1001 16:19:07.059014 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc087a812_124b_496c_afe6_b8ba3ca79ada.slice/crio-a1d230b39e26ef527b106306a82ec292c4c7395a06149679abe2baf997a0b2f8 WatchSource:0}: Error finding container a1d230b39e26ef527b106306a82ec292c4c7395a06149679abe2baf997a0b2f8: Status 404 returned error can't find the container with id a1d230b39e26ef527b106306a82ec292c4c7395a06149679abe2baf997a0b2f8
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.060310 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7hhkr"]
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.315122 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-wswz5"]
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.315355 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" containerName="dnsmasq-dns" containerID="cri-o://666984ffa11dbb84c93f93b9c576e7ea9fd99700dfed10bfb73669d09b5494ad" gracePeriod=10
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.347087 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-7gc4w"]
Oct 01 16:19:07 crc kubenswrapper[4764]: E1001 16:19:07.348815 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071b3286-64c8-4945-952e-3ba22f94e118" containerName="glance-db-sync"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.348843 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="071b3286-64c8-4945-952e-3ba22f94e118" containerName="glance-db-sync"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.349094 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="071b3286-64c8-4945-952e-3ba22f94e118" containerName="glance-db-sync"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.355413 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.368469 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vhgdf"]
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.390727 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-7gc4w"]
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.468603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-config\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.468683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz695\" (UniqueName: \"kubernetes.io/projected/ef4b073a-e109-47ee-9bd2-45c92b329310-kube-api-access-bz695\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.468716 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.468768 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.468792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.570280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.570326 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.570368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-config\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.570417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz695\" (UniqueName: \"kubernetes.io/projected/ef4b073a-e109-47ee-9bd2-45c92b329310-kube-api-access-bz695\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w"
Oct 01 16:19:07 crc 
kubenswrapper[4764]: I1001 16:19:07.570444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.573109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.573531 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.573579 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.573997 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-config\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.595677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bz695\" (UniqueName: \"kubernetes.io/projected/ef4b073a-e109-47ee-9bd2-45c92b329310-kube-api-access-bz695\") pod \"dnsmasq-dns-7987f74bbc-7gc4w\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.599504 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.671425 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-config-data\") pod \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.671594 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-credential-keys\") pod \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.671631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-combined-ca-bundle\") pod \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.671710 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xfj\" (UniqueName: \"kubernetes.io/projected/23db45bf-66ff-4806-8e6d-9d4e836bcb89-kube-api-access-w5xfj\") pod \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.671769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-fernet-keys\") pod \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.671801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-scripts\") pod \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\" (UID: \"23db45bf-66ff-4806-8e6d-9d4e836bcb89\") " Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.676262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23db45bf-66ff-4806-8e6d-9d4e836bcb89-kube-api-access-w5xfj" (OuterVolumeSpecName: "kube-api-access-w5xfj") pod "23db45bf-66ff-4806-8e6d-9d4e836bcb89" (UID: "23db45bf-66ff-4806-8e6d-9d4e836bcb89"). InnerVolumeSpecName "kube-api-access-w5xfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.676998 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "23db45bf-66ff-4806-8e6d-9d4e836bcb89" (UID: "23db45bf-66ff-4806-8e6d-9d4e836bcb89"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.677746 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-scripts" (OuterVolumeSpecName: "scripts") pod "23db45bf-66ff-4806-8e6d-9d4e836bcb89" (UID: "23db45bf-66ff-4806-8e6d-9d4e836bcb89"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.678402 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "23db45bf-66ff-4806-8e6d-9d4e836bcb89" (UID: "23db45bf-66ff-4806-8e6d-9d4e836bcb89"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.704076 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23db45bf-66ff-4806-8e6d-9d4e836bcb89" (UID: "23db45bf-66ff-4806-8e6d-9d4e836bcb89"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.706759 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-config-data" (OuterVolumeSpecName: "config-data") pod "23db45bf-66ff-4806-8e6d-9d4e836bcb89" (UID: "23db45bf-66ff-4806-8e6d-9d4e836bcb89"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.773809 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5xfj\" (UniqueName: \"kubernetes.io/projected/23db45bf-66ff-4806-8e6d-9d4e836bcb89-kube-api-access-w5xfj\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.774185 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.774195 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.774202 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.774211 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.774220 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23db45bf-66ff-4806-8e6d-9d4e836bcb89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.891226 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.979964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lrgx9" event={"ID":"23db45bf-66ff-4806-8e6d-9d4e836bcb89","Type":"ContainerDied","Data":"56ef9c5937f4849c4fc3f1c577f28ed808dec5dd75ddc201a5ec0b714ac6cfba"} Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.980602 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56ef9c5937f4849c4fc3f1c577f28ed808dec5dd75ddc201a5ec0b714ac6cfba" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.980650 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lrgx9" Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.985026 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf6e5841-6e54-4901-8504-9fde292244ab" containerID="666984ffa11dbb84c93f93b9c576e7ea9fd99700dfed10bfb73669d09b5494ad" exitCode=0 Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.985089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" event={"ID":"cf6e5841-6e54-4901-8504-9fde292244ab","Type":"ContainerDied","Data":"666984ffa11dbb84c93f93b9c576e7ea9fd99700dfed10bfb73669d09b5494ad"} Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.995206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhgdf" event={"ID":"7b75f147-9726-4336-8467-932ad4ff15f1","Type":"ContainerStarted","Data":"6b83c48e9be8d4f73fd5389c6ae466fdc26fe6d5e3248013ac8a3df774e9504f"} Oct 01 16:19:07 crc kubenswrapper[4764]: I1001 16:19:07.995250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhgdf" event={"ID":"7b75f147-9726-4336-8467-932ad4ff15f1","Type":"ContainerStarted","Data":"db7cbaebf1041c422fce4bcfe4c36a0137d240d7047d0707bee631d8304ead01"} Oct 01 16:19:08 crc 
kubenswrapper[4764]: I1001 16:19:08.006843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7hhkr" event={"ID":"c087a812-124b-496c-afe6-b8ba3ca79ada","Type":"ContainerStarted","Data":"a1d230b39e26ef527b106306a82ec292c4c7395a06149679abe2baf997a0b2f8"} Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.026544 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vhgdf" podStartSLOduration=2.026524859 podStartE2EDuration="2.026524859s" podCreationTimestamp="2025-10-01 16:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:08.020771587 +0000 UTC m=+1011.020418432" watchObservedRunningTime="2025-10-01 16:19:08.026524859 +0000 UTC m=+1011.026171704" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.140959 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lrgx9"] Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.149263 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lrgx9"] Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.230227 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d9nkl"] Oct 01 16:19:08 crc kubenswrapper[4764]: E1001 16:19:08.230535 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23db45bf-66ff-4806-8e6d-9d4e836bcb89" containerName="keystone-bootstrap" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.230553 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="23db45bf-66ff-4806-8e6d-9d4e836bcb89" containerName="keystone-bootstrap" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.231202 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="23db45bf-66ff-4806-8e6d-9d4e836bcb89" containerName="keystone-bootstrap" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 
16:19:08.231693 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.236838 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.237032 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9lshv" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.237163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.237265 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.273167 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d9nkl"] Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.292535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-combined-ca-bundle\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.292630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rjfc\" (UniqueName: \"kubernetes.io/projected/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-kube-api-access-9rjfc\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.292711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-scripts\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.292833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-fernet-keys\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.292931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-config-data\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.293056 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-credential-keys\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.397073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rjfc\" (UniqueName: \"kubernetes.io/projected/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-kube-api-access-9rjfc\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.397345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-scripts\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.397378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-fernet-keys\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.397413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-config-data\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.397466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-credential-keys\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.397519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-combined-ca-bundle\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.403087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-config-data\") pod \"keystone-bootstrap-d9nkl\" (UID: 
\"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.403092 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-scripts\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.403751 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-fernet-keys\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.404306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-combined-ca-bundle\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.411216 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-credential-keys\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 16:19:08.412248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rjfc\" (UniqueName: \"kubernetes.io/projected/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-kube-api-access-9rjfc\") pod \"keystone-bootstrap-d9nkl\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:08 crc kubenswrapper[4764]: I1001 
16:19:08.574181 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:09 crc kubenswrapper[4764]: I1001 16:19:09.732958 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23db45bf-66ff-4806-8e6d-9d4e836bcb89" path="/var/lib/kubelet/pods/23db45bf-66ff-4806-8e6d-9d4e836bcb89/volumes" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.561293 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.637321 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4dfb\" (UniqueName: \"kubernetes.io/projected/cf6e5841-6e54-4901-8504-9fde292244ab-kube-api-access-d4dfb\") pod \"cf6e5841-6e54-4901-8504-9fde292244ab\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.637426 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-nb\") pod \"cf6e5841-6e54-4901-8504-9fde292244ab\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.637469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-config\") pod \"cf6e5841-6e54-4901-8504-9fde292244ab\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.637524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-sb\") pod \"cf6e5841-6e54-4901-8504-9fde292244ab\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " Oct 01 
16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.637567 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-dns-svc\") pod \"cf6e5841-6e54-4901-8504-9fde292244ab\" (UID: \"cf6e5841-6e54-4901-8504-9fde292244ab\") " Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.650016 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6e5841-6e54-4901-8504-9fde292244ab-kube-api-access-d4dfb" (OuterVolumeSpecName: "kube-api-access-d4dfb") pod "cf6e5841-6e54-4901-8504-9fde292244ab" (UID: "cf6e5841-6e54-4901-8504-9fde292244ab"). InnerVolumeSpecName "kube-api-access-d4dfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.695466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf6e5841-6e54-4901-8504-9fde292244ab" (UID: "cf6e5841-6e54-4901-8504-9fde292244ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.706674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf6e5841-6e54-4901-8504-9fde292244ab" (UID: "cf6e5841-6e54-4901-8504-9fde292244ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.710992 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-config" (OuterVolumeSpecName: "config") pod "cf6e5841-6e54-4901-8504-9fde292244ab" (UID: "cf6e5841-6e54-4901-8504-9fde292244ab"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.733806 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf6e5841-6e54-4901-8504-9fde292244ab" (UID: "cf6e5841-6e54-4901-8504-9fde292244ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.739385 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4dfb\" (UniqueName: \"kubernetes.io/projected/cf6e5841-6e54-4901-8504-9fde292244ab-kube-api-access-d4dfb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.739413 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.739428 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.739436 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:10 crc kubenswrapper[4764]: I1001 16:19:10.739445 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf6e5841-6e54-4901-8504-9fde292244ab-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:11 crc kubenswrapper[4764]: I1001 16:19:11.036428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" event={"ID":"cf6e5841-6e54-4901-8504-9fde292244ab","Type":"ContainerDied","Data":"9a3b458912b9f502cab1bb5a7b16df7eb0328ff509cccefcd3ffcc2e48fc87df"} Oct 01 16:19:11 crc kubenswrapper[4764]: I1001 16:19:11.036485 4764 scope.go:117] "RemoveContainer" containerID="666984ffa11dbb84c93f93b9c576e7ea9fd99700dfed10bfb73669d09b5494ad" Oct 01 16:19:11 crc kubenswrapper[4764]: I1001 16:19:11.036612 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-wswz5" Oct 01 16:19:11 crc kubenswrapper[4764]: I1001 16:19:11.066238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-wswz5"] Oct 01 16:19:11 crc kubenswrapper[4764]: I1001 16:19:11.071611 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-wswz5"] Oct 01 16:19:11 crc kubenswrapper[4764]: I1001 16:19:11.732779 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" path="/var/lib/kubelet/pods/cf6e5841-6e54-4901-8504-9fde292244ab/volumes" Oct 01 16:19:16 crc kubenswrapper[4764]: I1001 16:19:16.176550 4764 scope.go:117] "RemoveContainer" containerID="ddf6b5178520dec74392eb768af8a06f199a811b0bfef63afaecc85c2d8ea291" Oct 01 16:19:16 crc kubenswrapper[4764]: I1001 16:19:16.640817 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-7gc4w"] Oct 01 16:19:16 crc kubenswrapper[4764]: W1001 16:19:16.645760 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef4b073a_e109_47ee_9bd2_45c92b329310.slice/crio-419f7ac0d09514c45b9c26444bb84a0161ea6e0a31b8b5b3fba0b83f707f081f WatchSource:0}: Error finding container 419f7ac0d09514c45b9c26444bb84a0161ea6e0a31b8b5b3fba0b83f707f081f: Status 404 returned error can't find the container with id 
419f7ac0d09514c45b9c26444bb84a0161ea6e0a31b8b5b3fba0b83f707f081f Oct 01 16:19:16 crc kubenswrapper[4764]: I1001 16:19:16.711006 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d9nkl"] Oct 01 16:19:16 crc kubenswrapper[4764]: W1001 16:19:16.718087 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f26797c_24fd_4c0c_bf6e_5cb3e53c898d.slice/crio-287af49fb5f20b63c0ac4a353683966eacede8ba097d487c081dbb4516511118 WatchSource:0}: Error finding container 287af49fb5f20b63c0ac4a353683966eacede8ba097d487c081dbb4516511118: Status 404 returned error can't find the container with id 287af49fb5f20b63c0ac4a353683966eacede8ba097d487c081dbb4516511118 Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.091889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7hhkr" event={"ID":"c087a812-124b-496c-afe6-b8ba3ca79ada","Type":"ContainerStarted","Data":"a0de567bbcc81cba6f96918fc7f6231c862b2e7b0b3e0b4940bc97e6ab986e63"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.094204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8qfb" event={"ID":"0c44de1b-4886-4c2b-a57c-a234a882e4a6","Type":"ContainerStarted","Data":"381f240de68976a99439117182e6687d000f39f56f744fb172deecd5f3bcdd24"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.096868 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerID="26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5" exitCode=0 Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.096943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" event={"ID":"ef4b073a-e109-47ee-9bd2-45c92b329310","Type":"ContainerDied","Data":"26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 
16:19:17.096975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" event={"ID":"ef4b073a-e109-47ee-9bd2-45c92b329310","Type":"ContainerStarted","Data":"419f7ac0d09514c45b9c26444bb84a0161ea6e0a31b8b5b3fba0b83f707f081f"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.099811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerStarted","Data":"ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.102182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9nkl" event={"ID":"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d","Type":"ContainerStarted","Data":"6558f414897181533fe9abd810770c686269aa91cc01be5c407a467c1c81f04b"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.102218 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9nkl" event={"ID":"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d","Type":"ContainerStarted","Data":"287af49fb5f20b63c0ac4a353683966eacede8ba097d487c081dbb4516511118"} Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.116675 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7hhkr" podStartSLOduration=1.970921098 podStartE2EDuration="11.116657435s" podCreationTimestamp="2025-10-01 16:19:06 +0000 UTC" firstStartedPulling="2025-10-01 16:19:07.063987955 +0000 UTC m=+1010.063634780" lastFinishedPulling="2025-10-01 16:19:16.209724232 +0000 UTC m=+1019.209371117" observedRunningTime="2025-10-01 16:19:17.106860863 +0000 UTC m=+1020.106507698" watchObservedRunningTime="2025-10-01 16:19:17.116657435 +0000 UTC m=+1020.116304270" Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.166131 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d9nkl" 
podStartSLOduration=9.166111055 podStartE2EDuration="9.166111055s" podCreationTimestamp="2025-10-01 16:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:17.159758988 +0000 UTC m=+1020.159405833" watchObservedRunningTime="2025-10-01 16:19:17.166111055 +0000 UTC m=+1020.165757880" Oct 01 16:19:17 crc kubenswrapper[4764]: I1001 16:19:17.182789 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-n8qfb" podStartSLOduration=2.405842732 podStartE2EDuration="17.182765816s" podCreationTimestamp="2025-10-01 16:19:00 +0000 UTC" firstStartedPulling="2025-10-01 16:19:01.414341112 +0000 UTC m=+1004.413987947" lastFinishedPulling="2025-10-01 16:19:16.191264156 +0000 UTC m=+1019.190911031" observedRunningTime="2025-10-01 16:19:17.174868491 +0000 UTC m=+1020.174515336" watchObservedRunningTime="2025-10-01 16:19:17.182765816 +0000 UTC m=+1020.182412651" Oct 01 16:19:18 crc kubenswrapper[4764]: I1001 16:19:18.130423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" event={"ID":"ef4b073a-e109-47ee-9bd2-45c92b329310","Type":"ContainerStarted","Data":"53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187"} Oct 01 16:19:18 crc kubenswrapper[4764]: I1001 16:19:18.130965 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:18 crc kubenswrapper[4764]: I1001 16:19:18.156177 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" podStartSLOduration=11.156145178 podStartE2EDuration="11.156145178s" podCreationTimestamp="2025-10-01 16:19:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:18.151177125 +0000 UTC m=+1021.150823970" 
watchObservedRunningTime="2025-10-01 16:19:18.156145178 +0000 UTC m=+1021.155792043" Oct 01 16:19:19 crc kubenswrapper[4764]: I1001 16:19:19.141750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerStarted","Data":"a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e"} Oct 01 16:19:20 crc kubenswrapper[4764]: I1001 16:19:20.154900 4764 generic.go:334] "Generic (PLEG): container finished" podID="7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" containerID="6558f414897181533fe9abd810770c686269aa91cc01be5c407a467c1c81f04b" exitCode=0 Oct 01 16:19:20 crc kubenswrapper[4764]: I1001 16:19:20.155001 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9nkl" event={"ID":"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d","Type":"ContainerDied","Data":"6558f414897181533fe9abd810770c686269aa91cc01be5c407a467c1c81f04b"} Oct 01 16:19:21 crc kubenswrapper[4764]: I1001 16:19:21.914413 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:19:21 crc kubenswrapper[4764]: I1001 16:19:21.914726 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:19:22 crc kubenswrapper[4764]: I1001 16:19:22.895248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:22 crc kubenswrapper[4764]: I1001 16:19:22.977675 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-5tgg8"] Oct 01 16:19:22 crc kubenswrapper[4764]: I1001 16:19:22.977922 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" containerID="cri-o://d0cae7baa939d9e85671dd02e05b486f90b972e32f73e3e1ead09eda1322f2cc" gracePeriod=10 Oct 01 16:19:24 crc kubenswrapper[4764]: I1001 16:19:24.196914 4764 generic.go:334] "Generic (PLEG): container finished" podID="1f5264d7-844c-4394-916a-efaed2507401" containerID="d0cae7baa939d9e85671dd02e05b486f90b972e32f73e3e1ead09eda1322f2cc" exitCode=0 Oct 01 16:19:24 crc kubenswrapper[4764]: I1001 16:19:24.196966 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" event={"ID":"1f5264d7-844c-4394-916a-efaed2507401","Type":"ContainerDied","Data":"d0cae7baa939d9e85671dd02e05b486f90b972e32f73e3e1ead09eda1322f2cc"} Oct 01 16:19:27 crc kubenswrapper[4764]: I1001 16:19:27.783405 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Oct 01 16:19:28 crc kubenswrapper[4764]: I1001 16:19:28.240836 4764 generic.go:334] "Generic (PLEG): container finished" podID="0c44de1b-4886-4c2b-a57c-a234a882e4a6" containerID="381f240de68976a99439117182e6687d000f39f56f744fb172deecd5f3bcdd24" exitCode=0 Oct 01 16:19:28 crc kubenswrapper[4764]: I1001 16:19:28.240890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8qfb" event={"ID":"0c44de1b-4886-4c2b-a57c-a234a882e4a6","Type":"ContainerDied","Data":"381f240de68976a99439117182e6687d000f39f56f744fb172deecd5f3bcdd24"} Oct 01 16:19:30 crc kubenswrapper[4764]: I1001 16:19:30.272292 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="c087a812-124b-496c-afe6-b8ba3ca79ada" containerID="a0de567bbcc81cba6f96918fc7f6231c862b2e7b0b3e0b4940bc97e6ab986e63" exitCode=0 Oct 01 16:19:30 crc kubenswrapper[4764]: I1001 16:19:30.272537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7hhkr" event={"ID":"c087a812-124b-496c-afe6-b8ba3ca79ada","Type":"ContainerDied","Data":"a0de567bbcc81cba6f96918fc7f6231c862b2e7b0b3e0b4940bc97e6ab986e63"} Oct 01 16:19:32 crc kubenswrapper[4764]: I1001 16:19:32.782983 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.540991 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7hhkr" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.562301 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-db-sync-config-data\") pod \"c087a812-124b-496c-afe6-b8ba3ca79ada\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.562389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-combined-ca-bundle\") pod \"c087a812-124b-496c-afe6-b8ba3ca79ada\" (UID: \"c087a812-124b-496c-afe6-b8ba3ca79ada\") " Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.562507 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krsbn\" (UniqueName: \"kubernetes.io/projected/c087a812-124b-496c-afe6-b8ba3ca79ada-kube-api-access-krsbn\") pod \"c087a812-124b-496c-afe6-b8ba3ca79ada\" (UID: 
\"c087a812-124b-496c-afe6-b8ba3ca79ada\") " Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.573020 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c087a812-124b-496c-afe6-b8ba3ca79ada-kube-api-access-krsbn" (OuterVolumeSpecName: "kube-api-access-krsbn") pod "c087a812-124b-496c-afe6-b8ba3ca79ada" (UID: "c087a812-124b-496c-afe6-b8ba3ca79ada"). InnerVolumeSpecName "kube-api-access-krsbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.577212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c087a812-124b-496c-afe6-b8ba3ca79ada" (UID: "c087a812-124b-496c-afe6-b8ba3ca79ada"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.612246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c087a812-124b-496c-afe6-b8ba3ca79ada" (UID: "c087a812-124b-496c-afe6-b8ba3ca79ada"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.665558 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.665614 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c087a812-124b-496c-afe6-b8ba3ca79ada-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:36 crc kubenswrapper[4764]: I1001 16:19:36.665635 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krsbn\" (UniqueName: \"kubernetes.io/projected/c087a812-124b-496c-afe6-b8ba3ca79ada-kube-api-access-krsbn\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.332317 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7hhkr" event={"ID":"c087a812-124b-496c-afe6-b8ba3ca79ada","Type":"ContainerDied","Data":"a1d230b39e26ef527b106306a82ec292c4c7395a06149679abe2baf997a0b2f8"} Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.332359 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1d230b39e26ef527b106306a82ec292c4c7395a06149679abe2baf997a0b2f8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.332374 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7hhkr" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.782789 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: connect: connection refused" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.783187 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:19:37 crc kubenswrapper[4764]: E1001 16:19:37.846425 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 01 16:19:37 crc kubenswrapper[4764]: E1001 16:19:37.846581 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bd7s7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dkmsh_openstack(44b571b7-d584-46bf-823a-bf8ce35c8dac): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:19:37 crc kubenswrapper[4764]: E1001 16:19:37.848622 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dkmsh" podUID="44b571b7-d584-46bf-823a-bf8ce35c8dac" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.869372 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-557bf5c9c4-gn9s8"] Oct 01 16:19:37 crc kubenswrapper[4764]: E1001 16:19:37.869750 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" containerName="dnsmasq-dns" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.869770 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" containerName="dnsmasq-dns" Oct 01 16:19:37 crc kubenswrapper[4764]: E1001 16:19:37.869798 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c087a812-124b-496c-afe6-b8ba3ca79ada" containerName="barbican-db-sync" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.869804 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c087a812-124b-496c-afe6-b8ba3ca79ada" containerName="barbican-db-sync" Oct 01 16:19:37 crc kubenswrapper[4764]: E1001 16:19:37.869822 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" containerName="init" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.869828 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" containerName="init" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.869988 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6e5841-6e54-4901-8504-9fde292244ab" containerName="dnsmasq-dns" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.870002 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c087a812-124b-496c-afe6-b8ba3ca79ada" containerName="barbican-db-sync" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.870951 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.876723 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xp6sm" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.876932 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.877071 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.882639 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56949f8bfc-chjrr"] Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.883968 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.885551 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.891288 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcx6x\" (UniqueName: \"kubernetes.io/projected/446019eb-78e7-4c76-983b-44a968141080-kube-api-access-rcx6x\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.891379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-config-data-custom\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.891414 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446019eb-78e7-4c76-983b-44a968141080-logs\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.891433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-combined-ca-bundle\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " 
pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.891468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-config-data\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.914132 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-557bf5c9c4-gn9s8"] Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.927903 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56949f8bfc-chjrr"] Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.937366 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.992575 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-combined-ca-bundle\") pod \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.992675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-scripts\") pod \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.992716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-config-data\") pod 
\"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.992773 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-credential-keys\") pod \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.992801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rjfc\" (UniqueName: \"kubernetes.io/projected/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-kube-api-access-9rjfc\") pod \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.992901 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-fernet-keys\") pod \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\" (UID: \"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d\") " Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-combined-ca-bundle\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-config-data\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 
16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-config-data\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcx6x\" (UniqueName: \"kubernetes.io/projected/446019eb-78e7-4c76-983b-44a968141080-kube-api-access-rcx6x\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnp9x\" (UniqueName: \"kubernetes.io/projected/f4930a41-989a-4747-b659-f35df5f73bd0-kube-api-access-bnp9x\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4930a41-989a-4747-b659-f35df5f73bd0-logs\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-config-data-custom\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: 
\"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-config-data-custom\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-combined-ca-bundle\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446019eb-78e7-4c76-983b-44a968141080-logs\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.993871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446019eb-78e7-4c76-983b-44a968141080-logs\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:37 crc kubenswrapper[4764]: I1001 16:19:37.996855 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.001405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-combined-ca-bundle\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.003959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-config-data\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.005549 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" (UID: "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.013735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/446019eb-78e7-4c76-983b-44a968141080-config-data-custom\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.020917 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" (UID: "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.023643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-kube-api-access-9rjfc" (OuterVolumeSpecName: "kube-api-access-9rjfc") pod "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" (UID: "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d"). InnerVolumeSpecName "kube-api-access-9rjfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.025115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-scripts" (OuterVolumeSpecName: "scripts") pod "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" (UID: "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.048784 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699df9757c-rtm8g"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.052625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcx6x\" (UniqueName: \"kubernetes.io/projected/446019eb-78e7-4c76-983b-44a968141080-kube-api-access-rcx6x\") pod \"barbican-keystone-listener-557bf5c9c4-gn9s8\" (UID: \"446019eb-78e7-4c76-983b-44a968141080\") " pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:38 crc kubenswrapper[4764]: E1001 16:19:38.053284 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c44de1b-4886-4c2b-a57c-a234a882e4a6" containerName="placement-db-sync" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.053313 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c44de1b-4886-4c2b-a57c-a234a882e4a6" containerName="placement-db-sync" Oct 01 16:19:38 crc kubenswrapper[4764]: E1001 16:19:38.053340 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" containerName="keystone-bootstrap" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.053349 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" containerName="keystone-bootstrap" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.053586 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" containerName="keystone-bootstrap" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.053598 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c44de1b-4886-4c2b-a57c-a234a882e4a6" containerName="placement-db-sync" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.054516 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.070227 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" (UID: "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-config-data\") pod \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c44de1b-4886-4c2b-a57c-a234a882e4a6-logs\") pod \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099543 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdmn6\" (UniqueName: \"kubernetes.io/projected/0c44de1b-4886-4c2b-a57c-a234a882e4a6-kube-api-access-fdmn6\") pod \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-scripts\") pod \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099663 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-combined-ca-bundle\") pod \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\" (UID: \"0c44de1b-4886-4c2b-a57c-a234a882e4a6\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099904 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzzm9\" (UniqueName: \"kubernetes.io/projected/e8d40a39-8de0-4a53-b329-0d06086d3f7b-kube-api-access-qzzm9\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnp9x\" (UniqueName: \"kubernetes.io/projected/f4930a41-989a-4747-b659-f35df5f73bd0-kube-api-access-bnp9x\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-dns-svc\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.099987 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.100056 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4930a41-989a-4747-b659-f35df5f73bd0-logs\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.100086 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-config-data-custom\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.100108 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-combined-ca-bundle\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.100136 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-config\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.100174 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-config-data\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.100845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c44de1b-4886-4c2b-a57c-a234a882e4a6-logs" (OuterVolumeSpecName: "logs") pod "0c44de1b-4886-4c2b-a57c-a234a882e4a6" (UID: "0c44de1b-4886-4c2b-a57c-a234a882e4a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.102141 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.102186 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rjfc\" (UniqueName: \"kubernetes.io/projected/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-kube-api-access-9rjfc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.102198 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.102208 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.102217 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.102554 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4930a41-989a-4747-b659-f35df5f73bd0-logs\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.107349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-combined-ca-bundle\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.117546 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-config-data-custom\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.120236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-scripts" (OuterVolumeSpecName: "scripts") pod "0c44de1b-4886-4c2b-a57c-a234a882e4a6" (UID: "0c44de1b-4886-4c2b-a57c-a234a882e4a6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.130928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-config-data" (OuterVolumeSpecName: "config-data") pod "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" (UID: "7f26797c-24fd-4c0c-bf6e-5cb3e53c898d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.135747 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-rtm8g"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.141242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c44de1b-4886-4c2b-a57c-a234a882e4a6-kube-api-access-fdmn6" (OuterVolumeSpecName: "kube-api-access-fdmn6") pod "0c44de1b-4886-4c2b-a57c-a234a882e4a6" (UID: "0c44de1b-4886-4c2b-a57c-a234a882e4a6"). InnerVolumeSpecName "kube-api-access-fdmn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.141631 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnp9x\" (UniqueName: \"kubernetes.io/projected/f4930a41-989a-4747-b659-f35df5f73bd0-kube-api-access-bnp9x\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.155330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4930a41-989a-4747-b659-f35df5f73bd0-config-data\") pod \"barbican-worker-56949f8bfc-chjrr\" (UID: \"f4930a41-989a-4747-b659-f35df5f73bd0\") " pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.176399 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c44de1b-4886-4c2b-a57c-a234a882e4a6" (UID: "0c44de1b-4886-4c2b-a57c-a234a882e4a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.183770 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b8cc5d5b6-6pwzq"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.187383 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.190362 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206251 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data-custom\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzzm9\" (UniqueName: \"kubernetes.io/projected/e8d40a39-8de0-4a53-b329-0d06086d3f7b-kube-api-access-qzzm9\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-dns-svc\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206414 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/89f5b048-56d4-433e-8be7-899ca92803c0-kube-api-access-5bblc\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89f5b048-56d4-433e-8be7-899ca92803c0-logs\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206583 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-config\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206605 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-combined-ca-bundle\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206660 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c44de1b-4886-4c2b-a57c-a234a882e4a6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206672 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdmn6\" (UniqueName: \"kubernetes.io/projected/0c44de1b-4886-4c2b-a57c-a234a882e4a6-kube-api-access-fdmn6\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206682 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206690 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.206700 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.218232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-sb\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.223089 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-config\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.225019 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b8cc5d5b6-6pwzq"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.226986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-config-data" (OuterVolumeSpecName: "config-data") pod "0c44de1b-4886-4c2b-a57c-a234a882e4a6" (UID: "0c44de1b-4886-4c2b-a57c-a234a882e4a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.227328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-nb\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.227452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-dns-svc\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.236670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzzm9\" (UniqueName: \"kubernetes.io/projected/e8d40a39-8de0-4a53-b329-0d06086d3f7b-kube-api-access-qzzm9\") pod \"dnsmasq-dns-699df9757c-rtm8g\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " 
pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.272224 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.307840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-nb\") pod \"1f5264d7-844c-4394-916a-efaed2507401\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.307885 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkf55\" (UniqueName: \"kubernetes.io/projected/1f5264d7-844c-4394-916a-efaed2507401-kube-api-access-xkf55\") pod \"1f5264d7-844c-4394-916a-efaed2507401\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308006 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-config\") pod \"1f5264d7-844c-4394-916a-efaed2507401\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-dns-svc\") pod \"1f5264d7-844c-4394-916a-efaed2507401\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-sb\") pod \"1f5264d7-844c-4394-916a-efaed2507401\" (UID: \"1f5264d7-844c-4394-916a-efaed2507401\") " Oct 01 
16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308698 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/89f5b048-56d4-433e-8be7-899ca92803c0-kube-api-access-5bblc\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89f5b048-56d4-433e-8be7-899ca92803c0-logs\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-combined-ca-bundle\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data-custom\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.308888 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 
16:19:38.308965 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c44de1b-4886-4c2b-a57c-a234a882e4a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.313731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89f5b048-56d4-433e-8be7-899ca92803c0-logs\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.334779 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.340028 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.340687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/89f5b048-56d4-433e-8be7-899ca92803c0-kube-api-access-5bblc\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.340940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-combined-ca-bundle\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.341350 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data-custom\") pod \"barbican-api-7b8cc5d5b6-6pwzq\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.345949 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56949f8bfc-chjrr" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.368637 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f5264d7-844c-4394-916a-efaed2507401-kube-api-access-xkf55" (OuterVolumeSpecName: "kube-api-access-xkf55") pod "1f5264d7-844c-4394-916a-efaed2507401" (UID: "1f5264d7-844c-4394-916a-efaed2507401"). InnerVolumeSpecName "kube-api-access-xkf55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.369130 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f5264d7-844c-4394-916a-efaed2507401" (UID: "1f5264d7-844c-4394-916a-efaed2507401"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.371205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n8qfb" event={"ID":"0c44de1b-4886-4c2b-a57c-a234a882e4a6","Type":"ContainerDied","Data":"a669a3716b144b610f189033196756e5e8490da544bdf5f9154bfaa2536c1d91"} Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.371239 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a669a3716b144b610f189033196756e5e8490da544bdf5f9154bfaa2536c1d91" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.371296 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n8qfb" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.377412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerStarted","Data":"057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346"} Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.387071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f5264d7-844c-4394-916a-efaed2507401" (UID: "1f5264d7-844c-4394-916a-efaed2507401"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.387946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9nkl" event={"ID":"7f26797c-24fd-4c0c-bf6e-5cb3e53c898d","Type":"ContainerDied","Data":"287af49fb5f20b63c0ac4a353683966eacede8ba097d487c081dbb4516511118"} Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.388028 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d9nkl" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.388229 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="287af49fb5f20b63c0ac4a353683966eacede8ba097d487c081dbb4516511118" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.389812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" event={"ID":"1f5264d7-844c-4394-916a-efaed2507401","Type":"ContainerDied","Data":"a782ff8caa7867881b889bbbdb3abaacb301f4b280a922e22c51c8e8672ebb87"} Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.389868 4764 scope.go:117] "RemoveContainer" containerID="d0cae7baa939d9e85671dd02e05b486f90b972e32f73e3e1ead09eda1322f2cc" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.390114 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-5tgg8" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.390606 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-config" (OuterVolumeSpecName: "config") pod "1f5264d7-844c-4394-916a-efaed2507401" (UID: "1f5264d7-844c-4394-916a-efaed2507401"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: E1001 16:19:38.391824 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dkmsh" podUID="44b571b7-d584-46bf-823a-bf8ce35c8dac" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.397358 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f5264d7-844c-4394-916a-efaed2507401" (UID: "1f5264d7-844c-4394-916a-efaed2507401"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.407955 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.412670 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.412690 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.412698 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.412707 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5264d7-844c-4394-916a-efaed2507401-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.412716 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkf55\" (UniqueName: \"kubernetes.io/projected/1f5264d7-844c-4394-916a-efaed2507401-kube-api-access-xkf55\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.413223 4764 scope.go:117] "RemoveContainer" containerID="8ad5566f3cd1337c594d3c8b17a80b6965095025ab1b6de33c33e015c10717ed" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.523192 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.733107 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5tgg8"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.736343 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-5tgg8"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.833542 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56949f8bfc-chjrr"] Oct 01 16:19:38 crc kubenswrapper[4764]: W1001 16:19:38.866024 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4930a41_989a_4747_b659_f35df5f73bd0.slice/crio-10e92ff472dee6d54b500fd55b94016d51a395992ea2c7dcada249d010179206 WatchSource:0}: Error finding container 10e92ff472dee6d54b500fd55b94016d51a395992ea2c7dcada249d010179206: Status 404 returned error can't find the container with id 10e92ff472dee6d54b500fd55b94016d51a395992ea2c7dcada249d010179206 Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.919460 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-557bf5c9c4-gn9s8"] Oct 01 16:19:38 crc kubenswrapper[4764]: I1001 16:19:38.970970 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-rtm8g"] Oct 01 16:19:38 crc kubenswrapper[4764]: W1001 16:19:38.990001 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8d40a39_8de0_4a53_b329_0d06086d3f7b.slice/crio-297f344190ccc04a8f0c242be5f9c2d300f4dc7001b0b5742468ae32e5a08091 WatchSource:0}: Error finding container 297f344190ccc04a8f0c242be5f9c2d300f4dc7001b0b5742468ae32e5a08091: Status 404 returned error can't find the container with id 297f344190ccc04a8f0c242be5f9c2d300f4dc7001b0b5742468ae32e5a08091 Oct 01 
16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.116416 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b8cc5d5b6-6pwzq"] Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.123329 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-78f7fcb65-9gxk4"] Oct 01 16:19:39 crc kubenswrapper[4764]: E1001 16:19:39.123776 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="init" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.123800 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="init" Oct 01 16:19:39 crc kubenswrapper[4764]: E1001 16:19:39.123844 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.123853 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.124120 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f5264d7-844c-4394-916a-efaed2507401" containerName="dnsmasq-dns" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.124854 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.129517 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78f7fcb65-9gxk4"] Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.130624 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.130740 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.130857 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.131070 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.131144 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.131074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-9lshv" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.170313 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8695dd9c7b-mwdsh"] Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.172155 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.175974 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.176266 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.176413 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9rvt9" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.176546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.183598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.194211 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8695dd9c7b-mwdsh"] Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-internal-tls-certs\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-scripts\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241235 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-config-data\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-public-tls-certs\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-public-tls-certs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d43038a4-064b-4ecf-bebf-0f4d6116a839-logs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-internal-tls-certs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241353 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-config-data\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-combined-ca-bundle\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241523 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-credential-keys\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-fernet-keys\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241646 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-scripts\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241671 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j625s\" (UniqueName: \"kubernetes.io/projected/d43038a4-064b-4ecf-bebf-0f4d6116a839-kube-api-access-j625s\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-combined-ca-bundle\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.241756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqrzp\" (UniqueName: \"kubernetes.io/projected/3b093936-cdfc-4f2c-a8a6-86820b145b73-kube-api-access-dqrzp\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-internal-tls-certs\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-scripts\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343166 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-config-data\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-public-tls-certs\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-public-tls-certs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343253 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d43038a4-064b-4ecf-bebf-0f4d6116a839-logs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343278 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-internal-tls-certs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-config-data\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-combined-ca-bundle\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-credential-keys\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-fernet-keys\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-scripts\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343419 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j625s\" (UniqueName: \"kubernetes.io/projected/d43038a4-064b-4ecf-bebf-0f4d6116a839-kube-api-access-j625s\") pod 
\"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-combined-ca-bundle\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.343468 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqrzp\" (UniqueName: \"kubernetes.io/projected/3b093936-cdfc-4f2c-a8a6-86820b145b73-kube-api-access-dqrzp\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.344145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d43038a4-064b-4ecf-bebf-0f4d6116a839-logs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.351712 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-scripts\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.353975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-credential-keys\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 
01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.354651 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-internal-tls-certs\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.354827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-config-data\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.355979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-fernet-keys\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.356316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-combined-ca-bundle\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.356847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-internal-tls-certs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.357403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-public-tls-certs\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.361338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-config-data\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.361849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b093936-cdfc-4f2c-a8a6-86820b145b73-scripts\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.363066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqrzp\" (UniqueName: \"kubernetes.io/projected/3b093936-cdfc-4f2c-a8a6-86820b145b73-kube-api-access-dqrzp\") pod \"keystone-78f7fcb65-9gxk4\" (UID: \"3b093936-cdfc-4f2c-a8a6-86820b145b73\") " pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.363359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j625s\" (UniqueName: \"kubernetes.io/projected/d43038a4-064b-4ecf-bebf-0f4d6116a839-kube-api-access-j625s\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.364298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-combined-ca-bundle\") pod 
\"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.364382 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43038a4-064b-4ecf-bebf-0f4d6116a839-public-tls-certs\") pod \"placement-8695dd9c7b-mwdsh\" (UID: \"d43038a4-064b-4ecf-bebf-0f4d6116a839\") " pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.400844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56949f8bfc-chjrr" event={"ID":"f4930a41-989a-4747-b659-f35df5f73bd0","Type":"ContainerStarted","Data":"10e92ff472dee6d54b500fd55b94016d51a395992ea2c7dcada249d010179206"} Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.402793 4764 generic.go:334] "Generic (PLEG): container finished" podID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerID="309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e" exitCode=0 Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.402844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" event={"ID":"e8d40a39-8de0-4a53-b329-0d06086d3f7b","Type":"ContainerDied","Data":"309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e"} Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.402858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" event={"ID":"e8d40a39-8de0-4a53-b329-0d06086d3f7b","Type":"ContainerStarted","Data":"297f344190ccc04a8f0c242be5f9c2d300f4dc7001b0b5742468ae32e5a08091"} Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.406970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" 
event={"ID":"446019eb-78e7-4c76-983b-44a968141080","Type":"ContainerStarted","Data":"c4a552174f54a8b6df2257b0fff1c580c9959a46a4caf1491d62ad107da42b3d"} Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.409284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" event={"ID":"89f5b048-56d4-433e-8be7-899ca92803c0","Type":"ContainerStarted","Data":"f5690a0515e0af167e6266fcd08780bf17da3f2d432789175acf137324d679e7"} Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.409388 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" event={"ID":"89f5b048-56d4-433e-8be7-899ca92803c0","Type":"ContainerStarted","Data":"c96b0d20637407de7b2bde95ff995a7a5eddf448cb96192faab61876c57b38f9"} Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.477928 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.499074 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.735360 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f5264d7-844c-4394-916a-efaed2507401" path="/var/lib/kubelet/pods/1f5264d7-844c-4394-916a-efaed2507401/volumes" Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.956007 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8695dd9c7b-mwdsh"] Oct 01 16:19:39 crc kubenswrapper[4764]: I1001 16:19:39.974305 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78f7fcb65-9gxk4"] Oct 01 16:19:39 crc kubenswrapper[4764]: W1001 16:19:39.982556 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd43038a4_064b_4ecf_bebf_0f4d6116a839.slice/crio-065fc8e188e43f8bc094eb9c193d063b29fa7fde10c9d1ec2a34c1741a9652db WatchSource:0}: Error finding container 065fc8e188e43f8bc094eb9c193d063b29fa7fde10c9d1ec2a34c1741a9652db: Status 404 returned error can't find the container with id 065fc8e188e43f8bc094eb9c193d063b29fa7fde10c9d1ec2a34c1741a9652db Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.421169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" event={"ID":"e8d40a39-8de0-4a53-b329-0d06086d3f7b","Type":"ContainerStarted","Data":"7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d"} Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.422207 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.427175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" event={"ID":"89f5b048-56d4-433e-8be7-899ca92803c0","Type":"ContainerStarted","Data":"790fd192ed7fdab21806ebc4f57bd21352d68d0579e641b72cd2e538b04b0473"} Oct 01 
16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.427990 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.428069 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.430582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78f7fcb65-9gxk4" event={"ID":"3b093936-cdfc-4f2c-a8a6-86820b145b73","Type":"ContainerStarted","Data":"1b66909a500bc736afbc9667b18302909be6e75f1f7f83c1e46d223b7fc30d0b"} Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.430621 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78f7fcb65-9gxk4" event={"ID":"3b093936-cdfc-4f2c-a8a6-86820b145b73","Type":"ContainerStarted","Data":"a398ad9551f2d89ab67ed8cf25ef18ac60cba6c97767cd1f0a072dd5c0c32687"} Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.431137 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.432993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8695dd9c7b-mwdsh" event={"ID":"d43038a4-064b-4ecf-bebf-0f4d6116a839","Type":"ContainerStarted","Data":"88f7f5e34e9820748ab19a7e9fc871f6201bd0694ca7d9a5fb7639e98399c435"} Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.433023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8695dd9c7b-mwdsh" event={"ID":"d43038a4-064b-4ecf-bebf-0f4d6116a839","Type":"ContainerStarted","Data":"065fc8e188e43f8bc094eb9c193d063b29fa7fde10c9d1ec2a34c1741a9652db"} Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.449724 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" podStartSLOduration=3.44970511 
podStartE2EDuration="3.44970511s" podCreationTimestamp="2025-10-01 16:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:40.44118096 +0000 UTC m=+1043.440827805" watchObservedRunningTime="2025-10-01 16:19:40.44970511 +0000 UTC m=+1043.449351945" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.468393 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podStartSLOduration=2.46836891 podStartE2EDuration="2.46836891s" podCreationTimestamp="2025-10-01 16:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:40.45983565 +0000 UTC m=+1043.459482495" watchObservedRunningTime="2025-10-01 16:19:40.46836891 +0000 UTC m=+1043.468015755" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.840887 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-78f7fcb65-9gxk4" podStartSLOduration=1.84086814 podStartE2EDuration="1.84086814s" podCreationTimestamp="2025-10-01 16:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:40.480990802 +0000 UTC m=+1043.480637637" watchObservedRunningTime="2025-10-01 16:19:40.84086814 +0000 UTC m=+1043.840514975" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.843814 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6bf96b65c4-djgxs"] Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.873682 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bf96b65c4-djgxs"] Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.873797 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.876541 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.878254 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.983469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-public-tls-certs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.983531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adede25f-2ef2-4d24-a18b-93865063b49f-logs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.983743 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-combined-ca-bundle\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.983951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-config-data\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " 
pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.984006 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-config-data-custom\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.984119 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zffkx\" (UniqueName: \"kubernetes.io/projected/adede25f-2ef2-4d24-a18b-93865063b49f-kube-api-access-zffkx\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:40 crc kubenswrapper[4764]: I1001 16:19:40.984148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-internal-tls-certs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zffkx\" (UniqueName: \"kubernetes.io/projected/adede25f-2ef2-4d24-a18b-93865063b49f-kube-api-access-zffkx\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-internal-tls-certs\") pod 
\"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-public-tls-certs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adede25f-2ef2-4d24-a18b-93865063b49f-logs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-combined-ca-bundle\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086584 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-config-data\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.086604 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-config-data-custom\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: 
\"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.087974 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adede25f-2ef2-4d24-a18b-93865063b49f-logs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.092001 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-public-tls-certs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.092299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-config-data-custom\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.094501 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-combined-ca-bundle\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.094593 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-internal-tls-certs\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " 
pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.096995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adede25f-2ef2-4d24-a18b-93865063b49f-config-data\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.103548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zffkx\" (UniqueName: \"kubernetes.io/projected/adede25f-2ef2-4d24-a18b-93865063b49f-kube-api-access-zffkx\") pod \"barbican-api-6bf96b65c4-djgxs\" (UID: \"adede25f-2ef2-4d24-a18b-93865063b49f\") " pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.199858 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.458184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" event={"ID":"446019eb-78e7-4c76-983b-44a968141080","Type":"ContainerStarted","Data":"3032de407d8c818d83075964d5116e77cdfaa5953593f0a2db3f024a363c4e33"} Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.467903 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b75f147-9726-4336-8467-932ad4ff15f1" containerID="6b83c48e9be8d4f73fd5389c6ae466fdc26fe6d5e3248013ac8a3df774e9504f" exitCode=0 Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.467973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhgdf" event={"ID":"7b75f147-9726-4336-8467-932ad4ff15f1","Type":"ContainerDied","Data":"6b83c48e9be8d4f73fd5389c6ae466fdc26fe6d5e3248013ac8a3df774e9504f"} Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.474746 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-8695dd9c7b-mwdsh" event={"ID":"d43038a4-064b-4ecf-bebf-0f4d6116a839","Type":"ContainerStarted","Data":"ad944b6cdf434a7fd7f987f291f8637994cbe9c59a13ee954a2b97fcfe2c8daa"} Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.474823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.475114 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.479089 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56949f8bfc-chjrr" event={"ID":"f4930a41-989a-4747-b659-f35df5f73bd0","Type":"ContainerStarted","Data":"de659a78a5fc3e1f53d46e26e064cb2f3ec33b945b5eb03a4ffc434de6c8f6fa"} Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.519520 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8695dd9c7b-mwdsh" podStartSLOduration=2.519502691 podStartE2EDuration="2.519502691s" podCreationTimestamp="2025-10-01 16:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:41.513411111 +0000 UTC m=+1044.513057946" watchObservedRunningTime="2025-10-01 16:19:41.519502691 +0000 UTC m=+1044.519149526" Oct 01 16:19:41 crc kubenswrapper[4764]: I1001 16:19:41.665708 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bf96b65c4-djgxs"] Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.495465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56949f8bfc-chjrr" event={"ID":"f4930a41-989a-4747-b659-f35df5f73bd0","Type":"ContainerStarted","Data":"7a21a8df00cc34afee70e02df5af1dd073e5b3258a8a46482b6a3654b1b93635"} Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.497923 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bf96b65c4-djgxs" event={"ID":"adede25f-2ef2-4d24-a18b-93865063b49f","Type":"ContainerStarted","Data":"2b698d15ff246f3d573b8fea7701106a7d7a6cf6711c028a29acda356efcab97"} Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.497960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bf96b65c4-djgxs" event={"ID":"adede25f-2ef2-4d24-a18b-93865063b49f","Type":"ContainerStarted","Data":"1ce607f9d018179fc04ecb907d47b0b598cb2646d48b5a24e11061e4afc53876"} Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.497971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bf96b65c4-djgxs" event={"ID":"adede25f-2ef2-4d24-a18b-93865063b49f","Type":"ContainerStarted","Data":"9561e8250747986249534cd1e2bcb23e3d733ef805f0731938907eac31d3dfe4"} Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.498082 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.499915 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" event={"ID":"446019eb-78e7-4c76-983b-44a968141080","Type":"ContainerStarted","Data":"054e4796c201ac1edd3c8d9cf8ed891650bf4850aa73793e0140f11cb720667b"} Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.521339 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56949f8bfc-chjrr" podStartSLOduration=3.221243455 podStartE2EDuration="5.521323196s" podCreationTimestamp="2025-10-01 16:19:37 +0000 UTC" firstStartedPulling="2025-10-01 16:19:38.873874716 +0000 UTC m=+1041.873521551" lastFinishedPulling="2025-10-01 16:19:41.173954457 +0000 UTC m=+1044.173601292" observedRunningTime="2025-10-01 16:19:42.519335176 +0000 UTC m=+1045.518982011" watchObservedRunningTime="2025-10-01 16:19:42.521323196 +0000 UTC 
m=+1045.520970031" Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.544227 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6bf96b65c4-djgxs" podStartSLOduration=2.54421037 podStartE2EDuration="2.54421037s" podCreationTimestamp="2025-10-01 16:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:42.539186626 +0000 UTC m=+1045.538833461" watchObservedRunningTime="2025-10-01 16:19:42.54421037 +0000 UTC m=+1045.543857205" Oct 01 16:19:42 crc kubenswrapper[4764]: I1001 16:19:42.572171 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-557bf5c9c4-gn9s8" podStartSLOduration=3.335950854 podStartE2EDuration="5.572150039s" podCreationTimestamp="2025-10-01 16:19:37 +0000 UTC" firstStartedPulling="2025-10-01 16:19:38.937777192 +0000 UTC m=+1041.937424027" lastFinishedPulling="2025-10-01 16:19:41.173976377 +0000 UTC m=+1044.173623212" observedRunningTime="2025-10-01 16:19:42.56125327 +0000 UTC m=+1045.560900115" watchObservedRunningTime="2025-10-01 16:19:42.572150039 +0000 UTC m=+1045.571796874" Oct 01 16:19:43 crc kubenswrapper[4764]: I1001 16:19:43.508083 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.280462 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vhgdf" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.403207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txvs5\" (UniqueName: \"kubernetes.io/projected/7b75f147-9726-4336-8467-932ad4ff15f1-kube-api-access-txvs5\") pod \"7b75f147-9726-4336-8467-932ad4ff15f1\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.403312 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-combined-ca-bundle\") pod \"7b75f147-9726-4336-8467-932ad4ff15f1\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.403462 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-config\") pod \"7b75f147-9726-4336-8467-932ad4ff15f1\" (UID: \"7b75f147-9726-4336-8467-932ad4ff15f1\") " Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.417044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b75f147-9726-4336-8467-932ad4ff15f1-kube-api-access-txvs5" (OuterVolumeSpecName: "kube-api-access-txvs5") pod "7b75f147-9726-4336-8467-932ad4ff15f1" (UID: "7b75f147-9726-4336-8467-932ad4ff15f1"). InnerVolumeSpecName "kube-api-access-txvs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.443200 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b75f147-9726-4336-8467-932ad4ff15f1" (UID: "7b75f147-9726-4336-8467-932ad4ff15f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.444824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-config" (OuterVolumeSpecName: "config") pod "7b75f147-9726-4336-8467-932ad4ff15f1" (UID: "7b75f147-9726-4336-8467-932ad4ff15f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.505948 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txvs5\" (UniqueName: \"kubernetes.io/projected/7b75f147-9726-4336-8467-932ad4ff15f1-kube-api-access-txvs5\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.506000 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.506017 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b75f147-9726-4336-8467-932ad4ff15f1-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.553347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vhgdf" event={"ID":"7b75f147-9726-4336-8467-932ad4ff15f1","Type":"ContainerDied","Data":"db7cbaebf1041c422fce4bcfe4c36a0137d240d7047d0707bee631d8304ead01"} Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.553396 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db7cbaebf1041c422fce4bcfe4c36a0137d240d7047d0707bee631d8304ead01" Oct 01 16:19:46 crc kubenswrapper[4764]: I1001 16:19:46.553490 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vhgdf" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.566102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerStarted","Data":"8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b"} Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.566537 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.566327 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-central-agent" containerID="cri-o://ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f" gracePeriod=30 Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.566345 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="proxy-httpd" containerID="cri-o://8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b" gracePeriod=30 Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.566380 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-notification-agent" containerID="cri-o://a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e" gracePeriod=30 Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.566435 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="sg-core" containerID="cri-o://057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346" gracePeriod=30 Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.576449 4764 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-rtm8g"] Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.576676 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerName="dnsmasq-dns" containerID="cri-o://7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d" gracePeriod=10 Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.588211 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.603696 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.778414865 podStartE2EDuration="47.603681983s" podCreationTimestamp="2025-10-01 16:19:00 +0000 UTC" firstStartedPulling="2025-10-01 16:19:01.27761284 +0000 UTC m=+1004.277259675" lastFinishedPulling="2025-10-01 16:19:47.102879918 +0000 UTC m=+1050.102526793" observedRunningTime="2025-10-01 16:19:47.602840502 +0000 UTC m=+1050.602487337" watchObservedRunningTime="2025-10-01 16:19:47.603681983 +0000 UTC m=+1050.603328818" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.625834 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-9tq7b"] Oct 01 16:19:47 crc kubenswrapper[4764]: E1001 16:19:47.626208 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b75f147-9726-4336-8467-932ad4ff15f1" containerName="neutron-db-sync" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.626225 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b75f147-9726-4336-8467-932ad4ff15f1" containerName="neutron-db-sync" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.626373 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b75f147-9726-4336-8467-932ad4ff15f1" 
containerName="neutron-db-sync" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.627664 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.668609 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-9tq7b"] Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.720182 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-777fd8fcb-q2t4n"] Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.725414 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.728791 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-vgmcf" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.729110 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.729223 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.729337 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.734103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-config\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.734179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfqh\" (UniqueName: 
\"kubernetes.io/projected/5128f87a-55a2-419c-aa4d-3b79288c8910-kube-api-access-tsfqh\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.734211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.734248 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-dns-svc\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.734269 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.775733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-777fd8fcb-q2t4n"] Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841212 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-combined-ca-bundle\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 
16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-config\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-config\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfqh\" (UniqueName: \"kubernetes.io/projected/5128f87a-55a2-419c-aa4d-3b79288c8910-kube-api-access-tsfqh\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841429 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-httpd-config\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841482 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6wr\" (UniqueName: \"kubernetes.io/projected/a44c789b-d197-43fe-ad1e-72f8d0c70cca-kube-api-access-sw6wr\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-dns-svc\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-ovndb-tls-certs\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.841591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.842798 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-config\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.842854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.842862 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-dns-svc\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.842965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.872650 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfqh\" (UniqueName: \"kubernetes.io/projected/5128f87a-55a2-419c-aa4d-3b79288c8910-kube-api-access-tsfqh\") pod \"dnsmasq-dns-6bb684768f-9tq7b\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.938186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.943022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-httpd-config\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 
16:19:47.943115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6wr\" (UniqueName: \"kubernetes.io/projected/a44c789b-d197-43fe-ad1e-72f8d0c70cca-kube-api-access-sw6wr\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.943175 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-ovndb-tls-certs\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.943247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-combined-ca-bundle\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.943276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-config\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.950184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-httpd-config\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.950184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-combined-ca-bundle\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.950949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-ovndb-tls-certs\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.953358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-config\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:47 crc kubenswrapper[4764]: I1001 16:19:47.958526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6wr\" (UniqueName: \"kubernetes.io/projected/a44c789b-d197-43fe-ad1e-72f8d0c70cca-kube-api-access-sw6wr\") pod \"neutron-777fd8fcb-q2t4n\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.077267 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.091624 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.092318 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.263471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzzm9\" (UniqueName: \"kubernetes.io/projected/e8d40a39-8de0-4a53-b329-0d06086d3f7b-kube-api-access-qzzm9\") pod \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.263570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-dns-svc\") pod \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.263634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-nb\") pod \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.263716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-sb\") pod \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.263776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-config\") pod \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\" (UID: \"e8d40a39-8de0-4a53-b329-0d06086d3f7b\") " Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.285667 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e8d40a39-8de0-4a53-b329-0d06086d3f7b-kube-api-access-qzzm9" (OuterVolumeSpecName: "kube-api-access-qzzm9") pod "e8d40a39-8de0-4a53-b329-0d06086d3f7b" (UID: "e8d40a39-8de0-4a53-b329-0d06086d3f7b"). InnerVolumeSpecName "kube-api-access-qzzm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.359170 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-config" (OuterVolumeSpecName: "config") pod "e8d40a39-8de0-4a53-b329-0d06086d3f7b" (UID: "e8d40a39-8de0-4a53-b329-0d06086d3f7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.360268 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8d40a39-8de0-4a53-b329-0d06086d3f7b" (UID: "e8d40a39-8de0-4a53-b329-0d06086d3f7b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.366744 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzzm9\" (UniqueName: \"kubernetes.io/projected/e8d40a39-8de0-4a53-b329-0d06086d3f7b-kube-api-access-qzzm9\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.366764 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.366773 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.383592 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8d40a39-8de0-4a53-b329-0d06086d3f7b" (UID: "e8d40a39-8de0-4a53-b329-0d06086d3f7b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.390440 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8d40a39-8de0-4a53-b329-0d06086d3f7b" (UID: "e8d40a39-8de0-4a53-b329-0d06086d3f7b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.467908 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.467937 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8d40a39-8de0-4a53-b329-0d06086d3f7b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.592189 4764 generic.go:334] "Generic (PLEG): container finished" podID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerID="8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b" exitCode=0 Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.592222 4764 generic.go:334] "Generic (PLEG): container finished" podID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerID="057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346" exitCode=2 Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.592229 4764 generic.go:334] "Generic (PLEG): container finished" podID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerID="ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f" exitCode=0 Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.592255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerDied","Data":"8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b"} Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.592331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerDied","Data":"057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346"} Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 
16:19:48.592345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerDied","Data":"ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f"} Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.594958 4764 generic.go:334] "Generic (PLEG): container finished" podID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerID="7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d" exitCode=0 Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.595014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" event={"ID":"e8d40a39-8de0-4a53-b329-0d06086d3f7b","Type":"ContainerDied","Data":"7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d"} Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.595033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" event={"ID":"e8d40a39-8de0-4a53-b329-0d06086d3f7b","Type":"ContainerDied","Data":"297f344190ccc04a8f0c242be5f9c2d300f4dc7001b0b5742468ae32e5a08091"} Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.595113 4764 scope.go:117] "RemoveContainer" containerID="7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.595320 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-699df9757c-rtm8g" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.629161 4764 scope.go:117] "RemoveContainer" containerID="309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.660965 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-rtm8g"] Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.671688 4764 scope.go:117] "RemoveContainer" containerID="7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d" Oct 01 16:19:48 crc kubenswrapper[4764]: E1001 16:19:48.672179 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d\": container with ID starting with 7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d not found: ID does not exist" containerID="7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.672226 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d"} err="failed to get container status \"7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d\": rpc error: code = NotFound desc = could not find container \"7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d\": container with ID starting with 7d2441b42055c4cb741fcd442fceedc4e4183ca7a06ef17df4cb324725763c5d not found: ID does not exist" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.672254 4764 scope.go:117] "RemoveContainer" containerID="309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e" Oct 01 16:19:48 crc kubenswrapper[4764]: E1001 16:19:48.672535 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e\": container with ID starting with 309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e not found: ID does not exist" containerID="309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.672561 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e"} err="failed to get container status \"309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e\": rpc error: code = NotFound desc = could not find container \"309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e\": container with ID starting with 309e442ce386e3fc0c223ccb7a4430df52fe55b424acd2184ae125f330cf009e not found: ID does not exist" Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.673840 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699df9757c-rtm8g"] Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.806685 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-777fd8fcb-q2t4n"] Oct 01 16:19:48 crc kubenswrapper[4764]: W1001 16:19:48.824241 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda44c789b_d197_43fe_ad1e_72f8d0c70cca.slice/crio-a2e87f5585cbcd2a9d7159bbf0a18d40a9696b3c1759bef99806004e9cfb23ba WatchSource:0}: Error finding container a2e87f5585cbcd2a9d7159bbf0a18d40a9696b3c1759bef99806004e9cfb23ba: Status 404 returned error can't find the container with id a2e87f5585cbcd2a9d7159bbf0a18d40a9696b3c1759bef99806004e9cfb23ba Oct 01 16:19:48 crc kubenswrapper[4764]: I1001 16:19:48.849698 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-9tq7b"] Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.603957 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-777fd8fcb-q2t4n" event={"ID":"a44c789b-d197-43fe-ad1e-72f8d0c70cca","Type":"ContainerStarted","Data":"01b5bd367bf8914e501d8c9c4c76458e12811b1d089c86cf579a2c7aacaa4490"} Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.605793 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.606725 4764 generic.go:334] "Generic (PLEG): container finished" podID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerID="8a97baa95a639c07f0ba933522b09bece922fe70ae6860602ebbb05b5d8d9b6c" exitCode=0 Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.605916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-777fd8fcb-q2t4n" event={"ID":"a44c789b-d197-43fe-ad1e-72f8d0c70cca","Type":"ContainerStarted","Data":"856168f7c342b4d5acde2b536aa2f0028b03075a3afa724412429c09504ec5ce"} Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.607376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-777fd8fcb-q2t4n" event={"ID":"a44c789b-d197-43fe-ad1e-72f8d0c70cca","Type":"ContainerStarted","Data":"a2e87f5585cbcd2a9d7159bbf0a18d40a9696b3c1759bef99806004e9cfb23ba"} Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.607445 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" event={"ID":"5128f87a-55a2-419c-aa4d-3b79288c8910","Type":"ContainerDied","Data":"8a97baa95a639c07f0ba933522b09bece922fe70ae6860602ebbb05b5d8d9b6c"} Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.607505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" event={"ID":"5128f87a-55a2-419c-aa4d-3b79288c8910","Type":"ContainerStarted","Data":"ab3fbf1c31e3e8cc0a1d13e6c6ea37d318693a57807056c91aab8ccc64c36328"} Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.626818 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-777fd8fcb-q2t4n" podStartSLOduration=2.626799091 podStartE2EDuration="2.626799091s" podCreationTimestamp="2025-10-01 16:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:49.624650708 +0000 UTC m=+1052.624297543" watchObservedRunningTime="2025-10-01 16:19:49.626799091 +0000 UTC m=+1052.626445916" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.742130 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" path="/var/lib/kubelet/pods/e8d40a39-8de0-4a53-b329-0d06086d3f7b/volumes" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.777134 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d577ff6cf-5gk59"] Oct 01 16:19:49 crc kubenswrapper[4764]: E1001 16:19:49.777631 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerName="init" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.777652 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerName="init" Oct 01 16:19:49 crc kubenswrapper[4764]: E1001 16:19:49.777678 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerName="dnsmasq-dns" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.777685 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerName="dnsmasq-dns" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.777866 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d40a39-8de0-4a53-b329-0d06086d3f7b" containerName="dnsmasq-dns" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.778871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.782719 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.783028 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.788258 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d577ff6cf-5gk59"] Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.798941 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6bf96b65c4-djgxs" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.895684 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b8cc5d5b6-6pwzq"] Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.896144 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log" containerID="cri-o://f5690a0515e0af167e6266fcd08780bf17da3f2d432789175acf137324d679e7" gracePeriod=30 Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.896552 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api" containerID="cri-o://790fd192ed7fdab21806ebc4f57bd21352d68d0579e641b72cd2e538b04b0473" gracePeriod=30 Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.901972 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-internal-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") 
" pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.902217 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-public-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.902336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-ovndb-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.902411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-config\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.902522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-httpd-config\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.902633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-combined-ca-bundle\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " 
pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.902708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c2dg\" (UniqueName: \"kubernetes.io/projected/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-kube-api-access-4c2dg\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.913392 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": EOF" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.917440 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": EOF" Oct 01 16:19:49 crc kubenswrapper[4764]: I1001 16:19:49.917470 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": EOF" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015744 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-internal-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015814 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-public-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-ovndb-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-config\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-httpd-config\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-combined-ca-bundle\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.015997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c2dg\" (UniqueName: \"kubernetes.io/projected/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-kube-api-access-4c2dg\") pod 
\"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.024242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-internal-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.030807 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-httpd-config\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.035910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-config\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.037321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-ovndb-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.038816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-combined-ca-bundle\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc 
kubenswrapper[4764]: I1001 16:19:50.039667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c2dg\" (UniqueName: \"kubernetes.io/projected/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-kube-api-access-4c2dg\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.047183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acde2ba2-32bc-4d80-aa8d-dd6505c14da3-public-tls-certs\") pod \"neutron-5d577ff6cf-5gk59\" (UID: \"acde2ba2-32bc-4d80-aa8d-dd6505c14da3\") " pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.097125 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.618851 4764 generic.go:334] "Generic (PLEG): container finished" podID="89f5b048-56d4-433e-8be7-899ca92803c0" containerID="f5690a0515e0af167e6266fcd08780bf17da3f2d432789175acf137324d679e7" exitCode=143 Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.618956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" event={"ID":"89f5b048-56d4-433e-8be7-899ca92803c0","Type":"ContainerDied","Data":"f5690a0515e0af167e6266fcd08780bf17da3f2d432789175acf137324d679e7"} Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.621509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" event={"ID":"5128f87a-55a2-419c-aa4d-3b79288c8910","Type":"ContainerStarted","Data":"4711db450b745531ad6b2458236fdd08c7c1146b120434cdae64a8b81ced7236"} Oct 01 16:19:50 crc kubenswrapper[4764]: W1001 16:19:50.649362 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacde2ba2_32bc_4d80_aa8d_dd6505c14da3.slice/crio-d79aae460a9450d9f0f8d47dc2a3af346667063c66f98dc321dee4fbd5842967 WatchSource:0}: Error finding container d79aae460a9450d9f0f8d47dc2a3af346667063c66f98dc321dee4fbd5842967: Status 404 returned error can't find the container with id d79aae460a9450d9f0f8d47dc2a3af346667063c66f98dc321dee4fbd5842967 Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.649893 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d577ff6cf-5gk59"] Oct 01 16:19:50 crc kubenswrapper[4764]: I1001 16:19:50.651345 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" podStartSLOduration=3.6513258349999997 podStartE2EDuration="3.651325835s" podCreationTimestamp="2025-10-01 16:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:50.643786859 +0000 UTC m=+1053.643433744" watchObservedRunningTime="2025-10-01 16:19:50.651325835 +0000 UTC m=+1053.650972690" Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.631186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d577ff6cf-5gk59" event={"ID":"acde2ba2-32bc-4d80-aa8d-dd6505c14da3","Type":"ContainerStarted","Data":"abaaf032385877fb96a5848f38ed3e165a4829a3f8a314039dd6e3ed2f8b5192"} Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.631830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d577ff6cf-5gk59" event={"ID":"acde2ba2-32bc-4d80-aa8d-dd6505c14da3","Type":"ContainerStarted","Data":"374716d809cf004e5ae4c2bd5f603ae396933108946be8bccbdea5b8ff161b6c"} Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.631847 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d577ff6cf-5gk59" 
event={"ID":"acde2ba2-32bc-4d80-aa8d-dd6505c14da3","Type":"ContainerStarted","Data":"d79aae460a9450d9f0f8d47dc2a3af346667063c66f98dc321dee4fbd5842967"} Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.632093 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.659904 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d577ff6cf-5gk59" podStartSLOduration=2.659886976 podStartE2EDuration="2.659886976s" podCreationTimestamp="2025-10-01 16:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:19:51.658513352 +0000 UTC m=+1054.658160177" watchObservedRunningTime="2025-10-01 16:19:51.659886976 +0000 UTC m=+1054.659533811" Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.914426 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.914485 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.914536 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.915194 4764 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"996ccf5d7c8e2755552554ff5a74e5db9102336da04bc8666a5ec3ad70d33d62"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:19:51 crc kubenswrapper[4764]: I1001 16:19:51.915280 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://996ccf5d7c8e2755552554ff5a74e5db9102336da04bc8666a5ec3ad70d33d62" gracePeriod=600 Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.189256 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-scripts\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263288 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76tpp\" (UniqueName: \"kubernetes.io/projected/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-kube-api-access-76tpp\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263340 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-log-httpd\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263382 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-config-data\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263398 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-sg-core-conf-yaml\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263460 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-run-httpd\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263536 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-combined-ca-bundle\") pod \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\" (UID: \"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36\") " Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.263984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.264627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.269542 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-kube-api-access-76tpp" (OuterVolumeSpecName: "kube-api-access-76tpp") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "kube-api-access-76tpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.289131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-scripts" (OuterVolumeSpecName: "scripts") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.294248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.366757 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.367235 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76tpp\" (UniqueName: \"kubernetes.io/projected/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-kube-api-access-76tpp\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.367349 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.367504 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.367578 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.384150 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.396238 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-config-data" (OuterVolumeSpecName: "config-data") pod "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" (UID: "a06e2493-1da1-4fd7-a4aa-1984c2dc7d36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.469549 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.470772 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.646553 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="996ccf5d7c8e2755552554ff5a74e5db9102336da04bc8666a5ec3ad70d33d62" exitCode=0 Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.646625 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"996ccf5d7c8e2755552554ff5a74e5db9102336da04bc8666a5ec3ad70d33d62"} Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.646844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"98c11f42deb2a855802db6e539c07b78ed64042cc307603fe80868f20ffd6d4f"} Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 
16:19:52.646861 4764 scope.go:117] "RemoveContainer" containerID="36994ceb1acaf44344047ef2a5795d007fa57999fff00a6c7967859219769b54" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.656890 4764 generic.go:334] "Generic (PLEG): container finished" podID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerID="a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e" exitCode=0 Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.656929 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.656965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerDied","Data":"a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e"} Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.657013 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a06e2493-1da1-4fd7-a4aa-1984c2dc7d36","Type":"ContainerDied","Data":"33f5886a93b0c0196860535fd7934dbcc69fb09d8ef6473018649230834bdb17"} Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.657810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.691390 4764 scope.go:117] "RemoveContainer" containerID="8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.700707 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.712999 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.721687 4764 scope.go:117] "RemoveContainer" containerID="057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346" Oct 01 16:19:52 
crc kubenswrapper[4764]: I1001 16:19:52.737803 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.738211 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-notification-agent" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738228 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-notification-agent" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.738241 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="sg-core" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738247 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="sg-core" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.738264 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-central-agent" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738271 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-central-agent" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.738289 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="proxy-httpd" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738295 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="proxy-httpd" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738452 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-notification-agent" Oct 01 16:19:52 crc kubenswrapper[4764]: 
I1001 16:19:52.738467 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="sg-core" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738483 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="ceilometer-central-agent" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.738496 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" containerName="proxy-httpd" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.739955 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.746420 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.747685 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.748476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.776202 4764 scope.go:117] "RemoveContainer" containerID="a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.800081 4764 scope.go:117] "RemoveContainer" containerID="ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.819170 4764 scope.go:117] "RemoveContainer" containerID="8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.819593 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b\": container with ID starting with 8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b not found: ID does not exist" containerID="8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.819625 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b"} err="failed to get container status \"8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b\": rpc error: code = NotFound desc = could not find container \"8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b\": container with ID starting with 8c685f97994075b41807392c919fb03103836e80b68adc8eb84dfa1f60bf774b not found: ID does not exist" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.819646 4764 scope.go:117] "RemoveContainer" containerID="057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.819847 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346\": container with ID starting with 057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346 not found: ID does not exist" containerID="057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.819869 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346"} err="failed to get container status \"057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346\": rpc error: code = NotFound desc = could not find container \"057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346\": container with ID 
starting with 057e2cfb8a39ac039b58d73dd553fa8114c33b1fc6507e405202dba57ec85346 not found: ID does not exist" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.819882 4764 scope.go:117] "RemoveContainer" containerID="a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.820076 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e\": container with ID starting with a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e not found: ID does not exist" containerID="a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.820094 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e"} err="failed to get container status \"a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e\": rpc error: code = NotFound desc = could not find container \"a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e\": container with ID starting with a292679557bd3aca731b4ecc0b02840fb106ac5e6d720ca8216216439614a98e not found: ID does not exist" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.820107 4764 scope.go:117] "RemoveContainer" containerID="ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f" Oct 01 16:19:52 crc kubenswrapper[4764]: E1001 16:19:52.821468 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f\": container with ID starting with ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f not found: ID does not exist" containerID="ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f" Oct 01 
16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.821490 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f"} err="failed to get container status \"ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f\": rpc error: code = NotFound desc = could not find container \"ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f\": container with ID starting with ee27e14c9fcf9a9cd496a61db0e3400419e3ee493e343739a07e44c3d3d5a12f not found: ID does not exist" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.877807 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-scripts\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.878065 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.878126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.878151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.878239 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.878279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8h9n\" (UniqueName: \"kubernetes.io/projected/d3fe3fe7-36a4-4161-820e-dd33c7119f48-kube-api-access-b8h9n\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.878328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-config-data\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.980944 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.981032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.981175 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.981242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8h9n\" (UniqueName: \"kubernetes.io/projected/d3fe3fe7-36a4-4161-820e-dd33c7119f48-kube-api-access-b8h9n\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.981374 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-config-data\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.981436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-scripts\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.981474 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.982008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " 
pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.983313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.988896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-scripts\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.989548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-config-data\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.989910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:52 crc kubenswrapper[4764]: I1001 16:19:52.992416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:53 crc kubenswrapper[4764]: I1001 16:19:53.003511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8h9n\" (UniqueName: 
\"kubernetes.io/projected/d3fe3fe7-36a4-4161-820e-dd33c7119f48-kube-api-access-b8h9n\") pod \"ceilometer-0\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " pod="openstack/ceilometer-0" Oct 01 16:19:53 crc kubenswrapper[4764]: I1001 16:19:53.067265 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:19:53 crc kubenswrapper[4764]: I1001 16:19:53.503902 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:19:53 crc kubenswrapper[4764]: W1001 16:19:53.519965 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3fe3fe7_36a4_4161_820e_dd33c7119f48.slice/crio-07bd4432280db87da4349f717a12a1ce9d137e4997b1405fde419e749fba758b WatchSource:0}: Error finding container 07bd4432280db87da4349f717a12a1ce9d137e4997b1405fde419e749fba758b: Status 404 returned error can't find the container with id 07bd4432280db87da4349f717a12a1ce9d137e4997b1405fde419e749fba758b Oct 01 16:19:53 crc kubenswrapper[4764]: I1001 16:19:53.670942 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkmsh" event={"ID":"44b571b7-d584-46bf-823a-bf8ce35c8dac","Type":"ContainerStarted","Data":"ece924ff63c46929ea0d32f523dea4e58ae3f16cc62d0a2d2ed1b6216573e35d"} Oct 01 16:19:53 crc kubenswrapper[4764]: I1001 16:19:53.672290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerStarted","Data":"07bd4432280db87da4349f717a12a1ce9d137e4997b1405fde419e749fba758b"} Oct 01 16:19:53 crc kubenswrapper[4764]: I1001 16:19:53.730554 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06e2493-1da1-4fd7-a4aa-1984c2dc7d36" path="/var/lib/kubelet/pods/a06e2493-1da1-4fd7-a4aa-1984c2dc7d36/volumes" Oct 01 16:19:54 crc kubenswrapper[4764]: I1001 16:19:54.704634 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerStarted","Data":"f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271"} Oct 01 16:19:54 crc kubenswrapper[4764]: I1001 16:19:54.959281 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.319133 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:52332->10.217.0.144:9311: read: connection reset by peer" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.319186 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.144:9311/healthcheck\": read tcp 10.217.0.2:56168->10.217.0.144:9311: read: connection reset by peer" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.714211 4764 generic.go:334] "Generic (PLEG): container finished" podID="89f5b048-56d4-433e-8be7-899ca92803c0" containerID="790fd192ed7fdab21806ebc4f57bd21352d68d0579e641b72cd2e538b04b0473" exitCode=0 Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.714274 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" event={"ID":"89f5b048-56d4-433e-8be7-899ca92803c0","Type":"ContainerDied","Data":"790fd192ed7fdab21806ebc4f57bd21352d68d0579e641b72cd2e538b04b0473"} Oct 01 16:19:55 crc 
kubenswrapper[4764]: I1001 16:19:55.714630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" event={"ID":"89f5b048-56d4-433e-8be7-899ca92803c0","Type":"ContainerDied","Data":"c96b0d20637407de7b2bde95ff995a7a5eddf448cb96192faab61876c57b38f9"} Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.714645 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c96b0d20637407de7b2bde95ff995a7a5eddf448cb96192faab61876c57b38f9" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.717270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerStarted","Data":"2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc"} Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.717297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerStarted","Data":"2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9"} Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.743246 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.789168 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dkmsh" podStartSLOduration=4.170454938 podStartE2EDuration="49.789143001s" podCreationTimestamp="2025-10-01 16:19:06 +0000 UTC" firstStartedPulling="2025-10-01 16:19:06.939273337 +0000 UTC m=+1009.938920172" lastFinishedPulling="2025-10-01 16:19:52.5579614 +0000 UTC m=+1055.557608235" observedRunningTime="2025-10-01 16:19:53.704677949 +0000 UTC m=+1056.704324784" watchObservedRunningTime="2025-10-01 16:19:55.789143001 +0000 UTC m=+1058.788789836" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.830612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89f5b048-56d4-433e-8be7-899ca92803c0-logs\") pod \"89f5b048-56d4-433e-8be7-899ca92803c0\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.830745 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data\") pod \"89f5b048-56d4-433e-8be7-899ca92803c0\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.830786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data-custom\") pod \"89f5b048-56d4-433e-8be7-899ca92803c0\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.830857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/89f5b048-56d4-433e-8be7-899ca92803c0-kube-api-access-5bblc\") 
pod \"89f5b048-56d4-433e-8be7-899ca92803c0\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.830913 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-combined-ca-bundle\") pod \"89f5b048-56d4-433e-8be7-899ca92803c0\" (UID: \"89f5b048-56d4-433e-8be7-899ca92803c0\") " Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.831159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f5b048-56d4-433e-8be7-899ca92803c0-logs" (OuterVolumeSpecName: "logs") pod "89f5b048-56d4-433e-8be7-899ca92803c0" (UID: "89f5b048-56d4-433e-8be7-899ca92803c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.831539 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89f5b048-56d4-433e-8be7-899ca92803c0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.834944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f5b048-56d4-433e-8be7-899ca92803c0-kube-api-access-5bblc" (OuterVolumeSpecName: "kube-api-access-5bblc") pod "89f5b048-56d4-433e-8be7-899ca92803c0" (UID: "89f5b048-56d4-433e-8be7-899ca92803c0"). InnerVolumeSpecName "kube-api-access-5bblc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.838125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "89f5b048-56d4-433e-8be7-899ca92803c0" (UID: "89f5b048-56d4-433e-8be7-899ca92803c0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.858600 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89f5b048-56d4-433e-8be7-899ca92803c0" (UID: "89f5b048-56d4-433e-8be7-899ca92803c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.887727 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data" (OuterVolumeSpecName: "config-data") pod "89f5b048-56d4-433e-8be7-899ca92803c0" (UID: "89f5b048-56d4-433e-8be7-899ca92803c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.932992 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.933022 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.933033 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bblc\" (UniqueName: \"kubernetes.io/projected/89f5b048-56d4-433e-8be7-899ca92803c0-kube-api-access-5bblc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:55 crc kubenswrapper[4764]: I1001 16:19:55.933041 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f5b048-56d4-433e-8be7-899ca92803c0-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 01 16:19:56 crc kubenswrapper[4764]: I1001 16:19:56.725552 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7b8cc5d5b6-6pwzq" Oct 01 16:19:56 crc kubenswrapper[4764]: I1001 16:19:56.778479 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7b8cc5d5b6-6pwzq"] Oct 01 16:19:56 crc kubenswrapper[4764]: I1001 16:19:56.790661 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7b8cc5d5b6-6pwzq"] Oct 01 16:19:57 crc kubenswrapper[4764]: I1001 16:19:57.740278 4764 generic.go:334] "Generic (PLEG): container finished" podID="44b571b7-d584-46bf-823a-bf8ce35c8dac" containerID="ece924ff63c46929ea0d32f523dea4e58ae3f16cc62d0a2d2ed1b6216573e35d" exitCode=0 Oct 01 16:19:57 crc kubenswrapper[4764]: I1001 16:19:57.755414 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" path="/var/lib/kubelet/pods/89f5b048-56d4-433e-8be7-899ca92803c0/volumes" Oct 01 16:19:57 crc kubenswrapper[4764]: I1001 16:19:57.756980 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkmsh" event={"ID":"44b571b7-d584-46bf-823a-bf8ce35c8dac","Type":"ContainerDied","Data":"ece924ff63c46929ea0d32f523dea4e58ae3f16cc62d0a2d2ed1b6216573e35d"} Oct 01 16:19:57 crc kubenswrapper[4764]: I1001 16:19:57.757085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerStarted","Data":"e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4"} Oct 01 16:19:57 crc kubenswrapper[4764]: I1001 16:19:57.758716 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:19:57 crc kubenswrapper[4764]: I1001 16:19:57.836892 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.747533545 podStartE2EDuration="5.836875086s" podCreationTimestamp="2025-10-01 16:19:52 +0000 UTC" firstStartedPulling="2025-10-01 16:19:53.526643137 +0000 UTC m=+1056.526289972" lastFinishedPulling="2025-10-01 16:19:56.615984678 +0000 UTC m=+1059.615631513" observedRunningTime="2025-10-01 16:19:57.820921973 +0000 UTC m=+1060.820568858" watchObservedRunningTime="2025-10-01 16:19:57.836875086 +0000 UTC m=+1060.836521921" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.081024 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.190066 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-7gc4w"] Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.190288 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerName="dnsmasq-dns" containerID="cri-o://53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187" gracePeriod=10 Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.729448 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.772947 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerID="53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187" exitCode=0 Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.773088 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.773102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" event={"ID":"ef4b073a-e109-47ee-9bd2-45c92b329310","Type":"ContainerDied","Data":"53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187"} Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.773153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-7gc4w" event={"ID":"ef4b073a-e109-47ee-9bd2-45c92b329310","Type":"ContainerDied","Data":"419f7ac0d09514c45b9c26444bb84a0161ea6e0a31b8b5b3fba0b83f707f081f"} Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.773171 4764 scope.go:117] "RemoveContainer" containerID="53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.792878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-sb\") pod \"ef4b073a-e109-47ee-9bd2-45c92b329310\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.792965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-config\") pod \"ef4b073a-e109-47ee-9bd2-45c92b329310\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.793070 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz695\" (UniqueName: \"kubernetes.io/projected/ef4b073a-e109-47ee-9bd2-45c92b329310-kube-api-access-bz695\") pod \"ef4b073a-e109-47ee-9bd2-45c92b329310\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.793213 
4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-dns-svc\") pod \"ef4b073a-e109-47ee-9bd2-45c92b329310\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.793245 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-nb\") pod \"ef4b073a-e109-47ee-9bd2-45c92b329310\" (UID: \"ef4b073a-e109-47ee-9bd2-45c92b329310\") " Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.804125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4b073a-e109-47ee-9bd2-45c92b329310-kube-api-access-bz695" (OuterVolumeSpecName: "kube-api-access-bz695") pod "ef4b073a-e109-47ee-9bd2-45c92b329310" (UID: "ef4b073a-e109-47ee-9bd2-45c92b329310"). InnerVolumeSpecName "kube-api-access-bz695". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.809041 4764 scope.go:117] "RemoveContainer" containerID="26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.868931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef4b073a-e109-47ee-9bd2-45c92b329310" (UID: "ef4b073a-e109-47ee-9bd2-45c92b329310"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.869146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef4b073a-e109-47ee-9bd2-45c92b329310" (UID: "ef4b073a-e109-47ee-9bd2-45c92b329310"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.877858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-config" (OuterVolumeSpecName: "config") pod "ef4b073a-e109-47ee-9bd2-45c92b329310" (UID: "ef4b073a-e109-47ee-9bd2-45c92b329310"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.895872 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.895905 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.895914 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz695\" (UniqueName: \"kubernetes.io/projected/ef4b073a-e109-47ee-9bd2-45c92b329310-kube-api-access-bz695\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.895925 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 
16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.898286 4764 scope.go:117] "RemoveContainer" containerID="53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187" Oct 01 16:19:58 crc kubenswrapper[4764]: E1001 16:19:58.899783 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187\": container with ID starting with 53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187 not found: ID does not exist" containerID="53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.899823 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187"} err="failed to get container status \"53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187\": rpc error: code = NotFound desc = could not find container \"53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187\": container with ID starting with 53017b083413fbaf0018f97d432b1f5f992bdfd0e91b209546c7e3d231e64187 not found: ID does not exist" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.899849 4764 scope.go:117] "RemoveContainer" containerID="26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5" Oct 01 16:19:58 crc kubenswrapper[4764]: E1001 16:19:58.900081 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5\": container with ID starting with 26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5 not found: ID does not exist" containerID="26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.900108 4764 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5"} err="failed to get container status \"26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5\": rpc error: code = NotFound desc = could not find container \"26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5\": container with ID starting with 26df21f29710caec04d2e507648077fd343797764f63abe76c972969364d13e5 not found: ID does not exist" Oct 01 16:19:58 crc kubenswrapper[4764]: I1001 16:19:58.909556 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef4b073a-e109-47ee-9bd2-45c92b329310" (UID: "ef4b073a-e109-47ee-9bd2-45c92b329310"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.000075 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef4b073a-e109-47ee-9bd2-45c92b329310-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.087977 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dkmsh" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.147211 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-7gc4w"] Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.161170 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-7gc4w"] Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.203672 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7s7\" (UniqueName: \"kubernetes.io/projected/44b571b7-d584-46bf-823a-bf8ce35c8dac-kube-api-access-bd7s7\") pod \"44b571b7-d584-46bf-823a-bf8ce35c8dac\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.204275 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-db-sync-config-data\") pod \"44b571b7-d584-46bf-823a-bf8ce35c8dac\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.204348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b571b7-d584-46bf-823a-bf8ce35c8dac-etc-machine-id\") pod \"44b571b7-d584-46bf-823a-bf8ce35c8dac\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.204466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-config-data\") pod \"44b571b7-d584-46bf-823a-bf8ce35c8dac\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.204525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/44b571b7-d584-46bf-823a-bf8ce35c8dac-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "44b571b7-d584-46bf-823a-bf8ce35c8dac" (UID: "44b571b7-d584-46bf-823a-bf8ce35c8dac"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.208333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-scripts\") pod \"44b571b7-d584-46bf-823a-bf8ce35c8dac\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.208414 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-combined-ca-bundle\") pod \"44b571b7-d584-46bf-823a-bf8ce35c8dac\" (UID: \"44b571b7-d584-46bf-823a-bf8ce35c8dac\") " Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.209188 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/44b571b7-d584-46bf-823a-bf8ce35c8dac-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.209656 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "44b571b7-d584-46bf-823a-bf8ce35c8dac" (UID: "44b571b7-d584-46bf-823a-bf8ce35c8dac"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.215262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-scripts" (OuterVolumeSpecName: "scripts") pod "44b571b7-d584-46bf-823a-bf8ce35c8dac" (UID: "44b571b7-d584-46bf-823a-bf8ce35c8dac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.218208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b571b7-d584-46bf-823a-bf8ce35c8dac-kube-api-access-bd7s7" (OuterVolumeSpecName: "kube-api-access-bd7s7") pod "44b571b7-d584-46bf-823a-bf8ce35c8dac" (UID: "44b571b7-d584-46bf-823a-bf8ce35c8dac"). InnerVolumeSpecName "kube-api-access-bd7s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.247379 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44b571b7-d584-46bf-823a-bf8ce35c8dac" (UID: "44b571b7-d584-46bf-823a-bf8ce35c8dac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.273195 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-config-data" (OuterVolumeSpecName: "config-data") pod "44b571b7-d584-46bf-823a-bf8ce35c8dac" (UID: "44b571b7-d584-46bf-823a-bf8ce35c8dac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.310174 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.310204 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.310212 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.310224 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd7s7\" (UniqueName: \"kubernetes.io/projected/44b571b7-d584-46bf-823a-bf8ce35c8dac-kube-api-access-bd7s7\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.310232 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44b571b7-d584-46bf-823a-bf8ce35c8dac-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.744928 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" path="/var/lib/kubelet/pods/ef4b073a-e109-47ee-9bd2-45c92b329310/volumes" Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.790031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkmsh" event={"ID":"44b571b7-d584-46bf-823a-bf8ce35c8dac","Type":"ContainerDied","Data":"5f6aa17b485253db85ce363bb590ee5d8113fe180475145e1b87369522c5cf22"} Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 
16:19:59.790111 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f6aa17b485253db85ce363bb590ee5d8113fe180475145e1b87369522c5cf22"
Oct 01 16:19:59 crc kubenswrapper[4764]: I1001 16:19:59.797752 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dkmsh"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057301 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 16:20:00 crc kubenswrapper[4764]: E1001 16:20:00.057628 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b571b7-d584-46bf-823a-bf8ce35c8dac" containerName="cinder-db-sync"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057644 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b571b7-d584-46bf-823a-bf8ce35c8dac" containerName="cinder-db-sync"
Oct 01 16:20:00 crc kubenswrapper[4764]: E1001 16:20:00.057656 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057662 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log"
Oct 01 16:20:00 crc kubenswrapper[4764]: E1001 16:20:00.057671 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerName="init"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057678 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerName="init"
Oct 01 16:20:00 crc kubenswrapper[4764]: E1001 16:20:00.057690 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerName="dnsmasq-dns"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057695 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerName="dnsmasq-dns"
Oct 01 16:20:00 crc kubenswrapper[4764]: E1001 16:20:00.057711 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057717 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057856 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b571b7-d584-46bf-823a-bf8ce35c8dac" containerName="cinder-db-sync"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057870 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api-log"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057884 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4b073a-e109-47ee-9bd2-45c92b329310" containerName="dnsmasq-dns"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.057895 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f5b048-56d4-433e-8be7-899ca92803c0" containerName="barbican-api"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.058711 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.062255 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.062328 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.062465 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-p6tk8"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.063889 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.084105 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.136770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.136830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfngh\" (UniqueName: \"kubernetes.io/projected/98f03dca-8e46-4db3-a187-6d56924bd91e-kube-api-access-wfngh\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.136851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.137090 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.137198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f03dca-8e46-4db3-a187-6d56924bd91e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.137305 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-scripts\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.165033 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"]
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.166431 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.178937 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"]
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241171 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f03dca-8e46-4db3-a187-6d56924bd91e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-scripts\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241358 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f03dca-8e46-4db3-a187-6d56924bd91e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241414 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-config\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241545 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw89j\" (UniqueName: \"kubernetes.io/projected/79e70d5f-7ec6-4c79-9044-b9495bf01054-kube-api-access-tw89j\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfngh\" (UniqueName: \"kubernetes.io/projected/98f03dca-8e46-4db3-a187-6d56924bd91e-kube-api-access-wfngh\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241753 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.241858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.244966 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-scripts\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.245100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.248229 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.261485 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.261611 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfngh\" (UniqueName: \"kubernetes.io/projected/98f03dca-8e46-4db3-a187-6d56924bd91e-kube-api-access-wfngh\") pod \"cinder-scheduler-0\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.322641 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.324247 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.330140 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.339268 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.342838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-config\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.342906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw89j\" (UniqueName: \"kubernetes.io/projected/79e70d5f-7ec6-4c79-9044-b9495bf01054-kube-api-access-tw89j\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.342927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.343057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.343076 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.343829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-config\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.343876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.343929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.344186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.363734 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw89j\" (UniqueName: \"kubernetes.io/projected/79e70d5f-7ec6-4c79-9044-b9495bf01054-kube-api-access-tw89j\") pod \"dnsmasq-dns-6d97fcdd8f-s6tcb\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.381808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.444441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsw9\" (UniqueName: \"kubernetes.io/projected/fc55e34a-fdd3-4709-9512-fa65179f8ef4-kube-api-access-6gsw9\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.445619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data-custom\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.445680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc55e34a-fdd3-4709-9512-fa65179f8ef4-logs\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.445724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-scripts\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.445758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.445776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.445822 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc55e34a-fdd3-4709-9512-fa65179f8ef4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.486164 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc55e34a-fdd3-4709-9512-fa65179f8ef4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsw9\" (UniqueName: \"kubernetes.io/projected/fc55e34a-fdd3-4709-9512-fa65179f8ef4-kube-api-access-6gsw9\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data-custom\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc55e34a-fdd3-4709-9512-fa65179f8ef4-logs\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-scripts\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.547482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.548016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc55e34a-fdd3-4709-9512-fa65179f8ef4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.548926 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc55e34a-fdd3-4709-9512-fa65179f8ef4-logs\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.554441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data-custom\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.554662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.554554 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-scripts\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.561205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.564339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsw9\" (UniqueName: \"kubernetes.io/projected/fc55e34a-fdd3-4709-9512-fa65179f8ef4-kube-api-access-6gsw9\") pod \"cinder-api-0\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") " pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.663156 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.840858 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 01 16:20:00 crc kubenswrapper[4764]: I1001 16:20:00.967390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"]
Oct 01 16:20:00 crc kubenswrapper[4764]: W1001 16:20:00.967875 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e70d5f_7ec6_4c79_9044_b9495bf01054.slice/crio-129617a6acb65f61aadc69d0beae5e98a04cc95236f111a3accf8fc97b688277 WatchSource:0}: Error finding container 129617a6acb65f61aadc69d0beae5e98a04cc95236f111a3accf8fc97b688277: Status 404 returned error can't find the container with id 129617a6acb65f61aadc69d0beae5e98a04cc95236f111a3accf8fc97b688277
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.116893 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 01 16:20:01 crc kubenswrapper[4764]: W1001 16:20:01.118095 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc55e34a_fdd3_4709_9512_fa65179f8ef4.slice/crio-2572b00ce06e65cb236ff3dba6523e028d7352a4126d78060d1061493bf47d5c WatchSource:0}: Error finding container 2572b00ce06e65cb236ff3dba6523e028d7352a4126d78060d1061493bf47d5c: Status 404 returned error can't find the container with id 2572b00ce06e65cb236ff3dba6523e028d7352a4126d78060d1061493bf47d5c
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.834201 4764 generic.go:334] "Generic (PLEG): container finished" podID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerID="e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46" exitCode=0
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.835014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" event={"ID":"79e70d5f-7ec6-4c79-9044-b9495bf01054","Type":"ContainerDied","Data":"e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46"}
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.835119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" event={"ID":"79e70d5f-7ec6-4c79-9044-b9495bf01054","Type":"ContainerStarted","Data":"129617a6acb65f61aadc69d0beae5e98a04cc95236f111a3accf8fc97b688277"}
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.839540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"98f03dca-8e46-4db3-a187-6d56924bd91e","Type":"ContainerStarted","Data":"21e01b4c5d607e3131a880e426a705c16d9e2b48d75fe6fb1461495ffee90907"}
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.842385 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fc55e34a-fdd3-4709-9512-fa65179f8ef4","Type":"ContainerStarted","Data":"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6"}
Oct 01 16:20:01 crc kubenswrapper[4764]: I1001 16:20:01.842476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fc55e34a-fdd3-4709-9512-fa65179f8ef4","Type":"ContainerStarted","Data":"2572b00ce06e65cb236ff3dba6523e028d7352a4126d78060d1061493bf47d5c"}
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.214942 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.855496 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fc55e34a-fdd3-4709-9512-fa65179f8ef4","Type":"ContainerStarted","Data":"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605"}
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.855613 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api-log" containerID="cri-o://31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6" gracePeriod=30
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.855661 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.855662 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api" containerID="cri-o://ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605" gracePeriod=30
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.869427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" event={"ID":"79e70d5f-7ec6-4c79-9044-b9495bf01054","Type":"ContainerStarted","Data":"09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e"}
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.869558 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.877262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"98f03dca-8e46-4db3-a187-6d56924bd91e","Type":"ContainerStarted","Data":"72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391"}
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.884300 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.8842815010000002 podStartE2EDuration="2.884281501s" podCreationTimestamp="2025-10-01 16:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:20:02.879914134 +0000 UTC m=+1065.879561009" watchObservedRunningTime="2025-10-01 16:20:02.884281501 +0000 UTC m=+1065.883928336"
Oct 01 16:20:02 crc kubenswrapper[4764]: I1001 16:20:02.914144 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" podStartSLOduration=2.914120537 podStartE2EDuration="2.914120537s" podCreationTimestamp="2025-10-01 16:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:20:02.907783101 +0000 UTC m=+1065.907429936" watchObservedRunningTime="2025-10-01 16:20:02.914120537 +0000 UTC m=+1065.913767382"
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.500921 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608418 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gsw9\" (UniqueName: \"kubernetes.io/projected/fc55e34a-fdd3-4709-9512-fa65179f8ef4-kube-api-access-6gsw9\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608560 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-scripts\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data-custom\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608654 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc55e34a-fdd3-4709-9512-fa65179f8ef4-etc-machine-id\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608679 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc55e34a-fdd3-4709-9512-fa65179f8ef4-logs\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.608745 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc55e34a-fdd3-4709-9512-fa65179f8ef4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.609136 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc55e34a-fdd3-4709-9512-fa65179f8ef4-logs" (OuterVolumeSpecName: "logs") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.609216 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-combined-ca-bundle\") pod \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\" (UID: \"fc55e34a-fdd3-4709-9512-fa65179f8ef4\") "
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.609798 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc55e34a-fdd3-4709-9512-fa65179f8ef4-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.609819 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc55e34a-fdd3-4709-9512-fa65179f8ef4-logs\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.624162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.624709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc55e34a-fdd3-4709-9512-fa65179f8ef4-kube-api-access-6gsw9" (OuterVolumeSpecName: "kube-api-access-6gsw9") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "kube-api-access-6gsw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.637096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-scripts" (OuterVolumeSpecName: "scripts") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.641163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.665594 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data" (OuterVolumeSpecName: "config-data") pod "fc55e34a-fdd3-4709-9512-fa65179f8ef4" (UID: "fc55e34a-fdd3-4709-9512-fa65179f8ef4"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.711401 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.711429 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.711439 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.711447 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc55e34a-fdd3-4709-9512-fa65179f8ef4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.711456 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gsw9\" (UniqueName: \"kubernetes.io/projected/fc55e34a-fdd3-4709-9512-fa65179f8ef4-kube-api-access-6gsw9\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.907855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"98f03dca-8e46-4db3-a187-6d56924bd91e","Type":"ContainerStarted","Data":"e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c"} Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.911581 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerID="ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605" exitCode=0 Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 
16:20:03.911613 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerID="31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6" exitCode=143 Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.912299 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.912914 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fc55e34a-fdd3-4709-9512-fa65179f8ef4","Type":"ContainerDied","Data":"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605"} Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.912939 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fc55e34a-fdd3-4709-9512-fa65179f8ef4","Type":"ContainerDied","Data":"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6"} Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.912956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fc55e34a-fdd3-4709-9512-fa65179f8ef4","Type":"ContainerDied","Data":"2572b00ce06e65cb236ff3dba6523e028d7352a4126d78060d1061493bf47d5c"} Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.912980 4764 scope.go:117] "RemoveContainer" containerID="ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.955161 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.015740595 podStartE2EDuration="3.955139399s" podCreationTimestamp="2025-10-01 16:20:00 +0000 UTC" firstStartedPulling="2025-10-01 16:20:00.846881831 +0000 UTC m=+1063.846528666" lastFinishedPulling="2025-10-01 16:20:01.786280645 +0000 UTC m=+1064.785927470" observedRunningTime="2025-10-01 16:20:03.950880814 +0000 UTC m=+1066.950527649" 
watchObservedRunningTime="2025-10-01 16:20:03.955139399 +0000 UTC m=+1066.954786234" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.969461 4764 scope.go:117] "RemoveContainer" containerID="31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.975360 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.986730 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.993548 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 16:20:03 crc kubenswrapper[4764]: E1001 16:20:03.993969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api-log" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.993993 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api-log" Oct 01 16:20:03 crc kubenswrapper[4764]: E1001 16:20:03.994016 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api" Oct 01 16:20:03 crc kubenswrapper[4764]: I1001 16:20:03.994026 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.000360 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.000395 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" containerName="cinder-api-log" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.001663 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.006901 4764 scope.go:117] "RemoveContainer" containerID="ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605" Oct 01 16:20:04 crc kubenswrapper[4764]: E1001 16:20:04.007962 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605\": container with ID starting with ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605 not found: ID does not exist" containerID="ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.008007 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605"} err="failed to get container status \"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605\": rpc error: code = NotFound desc = could not find container \"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605\": container with ID starting with ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605 not found: ID does not exist" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.008032 4764 scope.go:117] "RemoveContainer" containerID="31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.009952 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.010711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.010859 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 16:20:04 crc 
kubenswrapper[4764]: E1001 16:20:04.013209 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6\": container with ID starting with 31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6 not found: ID does not exist" containerID="31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.013270 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6"} err="failed to get container status \"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6\": rpc error: code = NotFound desc = could not find container \"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6\": container with ID starting with 31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6 not found: ID does not exist" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.013306 4764 scope.go:117] "RemoveContainer" containerID="ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.014312 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605"} err="failed to get container status \"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605\": rpc error: code = NotFound desc = could not find container \"ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605\": container with ID starting with ec3db5b2796f48ea98ac695e65aab1df047f2f3db44b3283ba594f79f6040605 not found: ID does not exist" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.014340 4764 scope.go:117] "RemoveContainer" containerID="31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6" Oct 01 
16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.014641 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6"} err="failed to get container status \"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6\": rpc error: code = NotFound desc = could not find container \"31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6\": container with ID starting with 31ccb52102d0e4d7bc104d89c7ecd763cf667d04dfa109435c2029ace32846b6 not found: ID does not exist" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.022311 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.128690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55dh4\" (UniqueName: \"kubernetes.io/projected/7934a61c-2af3-4c51-987b-411ee1c7645f-kube-api-access-55dh4\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.128763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.128814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-scripts\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.128904 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.128934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-config-data\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.128964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.129008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7934a61c-2af3-4c51-987b-411ee1c7645f-logs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.129062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7934a61c-2af3-4c51-987b-411ee1c7645f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.129100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-config-data-custom\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.230625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-scripts\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231268 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-config-data\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231349 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 
16:20:04.231448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7934a61c-2af3-4c51-987b-411ee1c7645f-logs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7934a61c-2af3-4c51-987b-411ee1c7645f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-config-data-custom\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55dh4\" (UniqueName: \"kubernetes.io/projected/7934a61c-2af3-4c51-987b-411ee1c7645f-kube-api-access-55dh4\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7934a61c-2af3-4c51-987b-411ee1c7645f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.231910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7934a61c-2af3-4c51-987b-411ee1c7645f-logs\") pod \"cinder-api-0\" (UID: 
\"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.236483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-config-data-custom\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.237642 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.237684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-config-data\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.238509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.239790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.240785 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7934a61c-2af3-4c51-987b-411ee1c7645f-scripts\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.258108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55dh4\" (UniqueName: \"kubernetes.io/projected/7934a61c-2af3-4c51-987b-411ee1c7645f-kube-api-access-55dh4\") pod \"cinder-api-0\" (UID: \"7934a61c-2af3-4c51-987b-411ee1c7645f\") " pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.333963 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.826321 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 16:20:04 crc kubenswrapper[4764]: I1001 16:20:04.929662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7934a61c-2af3-4c51-987b-411ee1c7645f","Type":"ContainerStarted","Data":"b964e525277c12a0cc413c408a91cd51c01058429acd958cbf051d84976cb94c"} Oct 01 16:20:05 crc kubenswrapper[4764]: I1001 16:20:05.382633 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 16:20:05 crc kubenswrapper[4764]: I1001 16:20:05.742592 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc55e34a-fdd3-4709-9512-fa65179f8ef4" path="/var/lib/kubelet/pods/fc55e34a-fdd3-4709-9512-fa65179f8ef4/volumes" Oct 01 16:20:05 crc kubenswrapper[4764]: I1001 16:20:05.943083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7934a61c-2af3-4c51-987b-411ee1c7645f","Type":"ContainerStarted","Data":"6bdb4f96d9232c7b9c404d4b20218d965bd0cba7d11c15bd7358ab5e6de78d81"} Oct 01 16:20:06 crc kubenswrapper[4764]: I1001 16:20:06.971152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"7934a61c-2af3-4c51-987b-411ee1c7645f","Type":"ContainerStarted","Data":"e425ce7bc6b2b84703c0223d14cbbd978f885ab69d5998cf77965ea5e568edd9"} Oct 01 16:20:06 crc kubenswrapper[4764]: I1001 16:20:06.971700 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 16:20:07 crc kubenswrapper[4764]: I1001 16:20:07.011990 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.011960077 podStartE2EDuration="4.011960077s" podCreationTimestamp="2025-10-01 16:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:20:07.004294189 +0000 UTC m=+1070.003941044" watchObservedRunningTime="2025-10-01 16:20:07.011960077 +0000 UTC m=+1070.011606962" Oct 01 16:20:10 crc kubenswrapper[4764]: I1001 16:20:10.488836 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" Oct 01 16:20:10 crc kubenswrapper[4764]: I1001 16:20:10.551931 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-9tq7b"] Oct 01 16:20:10 crc kubenswrapper[4764]: I1001 16:20:10.553936 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerName="dnsmasq-dns" containerID="cri-o://4711db450b745531ad6b2458236fdd08c7c1146b120434cdae64a8b81ced7236" gracePeriod=10 Oct 01 16:20:10 crc kubenswrapper[4764]: I1001 16:20:10.574234 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:20:10 crc kubenswrapper[4764]: I1001 16:20:10.574922 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8695dd9c7b-mwdsh" Oct 01 16:20:10 crc 
kubenswrapper[4764]: I1001 16:20:10.706535 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 16:20:10 crc kubenswrapper[4764]: I1001 16:20:10.742700 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.015244 4764 generic.go:334] "Generic (PLEG): container finished" podID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerID="4711db450b745531ad6b2458236fdd08c7c1146b120434cdae64a8b81ced7236" exitCode=0 Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.015340 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" event={"ID":"5128f87a-55a2-419c-aa4d-3b79288c8910","Type":"ContainerDied","Data":"4711db450b745531ad6b2458236fdd08c7c1146b120434cdae64a8b81ced7236"} Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.015383 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" event={"ID":"5128f87a-55a2-419c-aa4d-3b79288c8910","Type":"ContainerDied","Data":"ab3fbf1c31e3e8cc0a1d13e6c6ea37d318693a57807056c91aab8ccc64c36328"} Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.015398 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab3fbf1c31e3e8cc0a1d13e6c6ea37d318693a57807056c91aab8ccc64c36328" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.015692 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="cinder-scheduler" containerID="cri-o://72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391" gracePeriod=30 Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.015830 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="probe" 
containerID="cri-o://e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c" gracePeriod=30 Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.032193 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.096097 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-78f7fcb65-9gxk4" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.183221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsfqh\" (UniqueName: \"kubernetes.io/projected/5128f87a-55a2-419c-aa4d-3b79288c8910-kube-api-access-tsfqh\") pod \"5128f87a-55a2-419c-aa4d-3b79288c8910\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.183351 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-dns-svc\") pod \"5128f87a-55a2-419c-aa4d-3b79288c8910\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.183389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-nb\") pod \"5128f87a-55a2-419c-aa4d-3b79288c8910\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.183482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-sb\") pod \"5128f87a-55a2-419c-aa4d-3b79288c8910\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.183519 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-config\") pod \"5128f87a-55a2-419c-aa4d-3b79288c8910\" (UID: \"5128f87a-55a2-419c-aa4d-3b79288c8910\") " Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.204763 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5128f87a-55a2-419c-aa4d-3b79288c8910-kube-api-access-tsfqh" (OuterVolumeSpecName: "kube-api-access-tsfqh") pod "5128f87a-55a2-419c-aa4d-3b79288c8910" (UID: "5128f87a-55a2-419c-aa4d-3b79288c8910"). InnerVolumeSpecName "kube-api-access-tsfqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.233715 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-config" (OuterVolumeSpecName: "config") pod "5128f87a-55a2-419c-aa4d-3b79288c8910" (UID: "5128f87a-55a2-419c-aa4d-3b79288c8910"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.235083 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5128f87a-55a2-419c-aa4d-3b79288c8910" (UID: "5128f87a-55a2-419c-aa4d-3b79288c8910"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.236194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5128f87a-55a2-419c-aa4d-3b79288c8910" (UID: "5128f87a-55a2-419c-aa4d-3b79288c8910"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.248175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5128f87a-55a2-419c-aa4d-3b79288c8910" (UID: "5128f87a-55a2-419c-aa4d-3b79288c8910"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.317768 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.317808 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.317824 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsfqh\" (UniqueName: \"kubernetes.io/projected/5128f87a-55a2-419c-aa4d-3b79288c8910-kube-api-access-tsfqh\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.317836 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:11 crc kubenswrapper[4764]: I1001 16:20:11.317846 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5128f87a-55a2-419c-aa4d-3b79288c8910-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:12 crc kubenswrapper[4764]: I1001 16:20:12.027099 4764 generic.go:334] "Generic (PLEG): container finished" podID="98f03dca-8e46-4db3-a187-6d56924bd91e" 
containerID="e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c" exitCode=0 Oct 01 16:20:12 crc kubenswrapper[4764]: I1001 16:20:12.027405 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-9tq7b" Oct 01 16:20:12 crc kubenswrapper[4764]: I1001 16:20:12.027171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"98f03dca-8e46-4db3-a187-6d56924bd91e","Type":"ContainerDied","Data":"e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c"} Oct 01 16:20:12 crc kubenswrapper[4764]: I1001 16:20:12.051605 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-9tq7b"] Oct 01 16:20:12 crc kubenswrapper[4764]: I1001 16:20:12.061661 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-9tq7b"] Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.734380 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" path="/var/lib/kubelet/pods/5128f87a-55a2-419c-aa4d-3b79288c8910/volumes" Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.832153 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.963738 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data-custom\") pod \"98f03dca-8e46-4db3-a187-6d56924bd91e\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964072 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfngh\" (UniqueName: \"kubernetes.io/projected/98f03dca-8e46-4db3-a187-6d56924bd91e-kube-api-access-wfngh\") pod \"98f03dca-8e46-4db3-a187-6d56924bd91e\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964171 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data\") pod \"98f03dca-8e46-4db3-a187-6d56924bd91e\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964282 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-combined-ca-bundle\") pod \"98f03dca-8e46-4db3-a187-6d56924bd91e\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f03dca-8e46-4db3-a187-6d56924bd91e-etc-machine-id\") pod \"98f03dca-8e46-4db3-a187-6d56924bd91e\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-scripts\") pod \"98f03dca-8e46-4db3-a187-6d56924bd91e\" (UID: \"98f03dca-8e46-4db3-a187-6d56924bd91e\") " Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964459 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98f03dca-8e46-4db3-a187-6d56924bd91e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "98f03dca-8e46-4db3-a187-6d56924bd91e" (UID: "98f03dca-8e46-4db3-a187-6d56924bd91e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.964950 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/98f03dca-8e46-4db3-a187-6d56924bd91e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.972168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "98f03dca-8e46-4db3-a187-6d56924bd91e" (UID: "98f03dca-8e46-4db3-a187-6d56924bd91e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.972275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f03dca-8e46-4db3-a187-6d56924bd91e-kube-api-access-wfngh" (OuterVolumeSpecName: "kube-api-access-wfngh") pod "98f03dca-8e46-4db3-a187-6d56924bd91e" (UID: "98f03dca-8e46-4db3-a187-6d56924bd91e"). InnerVolumeSpecName "kube-api-access-wfngh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:13 crc kubenswrapper[4764]: I1001 16:20:13.984256 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-scripts" (OuterVolumeSpecName: "scripts") pod "98f03dca-8e46-4db3-a187-6d56924bd91e" (UID: "98f03dca-8e46-4db3-a187-6d56924bd91e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.024651 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98f03dca-8e46-4db3-a187-6d56924bd91e" (UID: "98f03dca-8e46-4db3-a187-6d56924bd91e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.049326 4764 generic.go:334] "Generic (PLEG): container finished" podID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerID="72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391" exitCode=0 Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.049365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"98f03dca-8e46-4db3-a187-6d56924bd91e","Type":"ContainerDied","Data":"72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391"} Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.049390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"98f03dca-8e46-4db3-a187-6d56924bd91e","Type":"ContainerDied","Data":"21e01b4c5d607e3131a880e426a705c16d9e2b48d75fe6fb1461495ffee90907"} Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.049408 4764 scope.go:117] "RemoveContainer" containerID="e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c" Oct 01 16:20:14 crc kubenswrapper[4764]: 
I1001 16:20:14.049475 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.077130 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.077166 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfngh\" (UniqueName: \"kubernetes.io/projected/98f03dca-8e46-4db3-a187-6d56924bd91e-kube-api-access-wfngh\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.077184 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.077196 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.092423 4764 scope.go:117] "RemoveContainer" containerID="72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.108340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data" (OuterVolumeSpecName: "config-data") pod "98f03dca-8e46-4db3-a187-6d56924bd91e" (UID: "98f03dca-8e46-4db3-a187-6d56924bd91e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.149620 4764 scope.go:117] "RemoveContainer" containerID="e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c" Oct 01 16:20:14 crc kubenswrapper[4764]: E1001 16:20:14.150161 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c\": container with ID starting with e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c not found: ID does not exist" containerID="e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.150232 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c"} err="failed to get container status \"e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c\": rpc error: code = NotFound desc = could not find container \"e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c\": container with ID starting with e0f70a8657d5cba92bb84ce28f3c315d571b6e42f734c523dbcdedf1be36ca0c not found: ID does not exist" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.150271 4764 scope.go:117] "RemoveContainer" containerID="72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391" Oct 01 16:20:14 crc kubenswrapper[4764]: E1001 16:20:14.150893 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391\": container with ID starting with 72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391 not found: ID does not exist" containerID="72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.150948 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391"} err="failed to get container status \"72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391\": rpc error: code = NotFound desc = could not find container \"72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391\": container with ID starting with 72090cf620909bce4a8f871a6e462f83af1f79332ec40949e7492f53c4644391 not found: ID does not exist" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154174 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 16:20:14 crc kubenswrapper[4764]: E1001 16:20:14.154519 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="cinder-scheduler" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154537 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="cinder-scheduler" Oct 01 16:20:14 crc kubenswrapper[4764]: E1001 16:20:14.154586 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="probe" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154595 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="probe" Oct 01 16:20:14 crc kubenswrapper[4764]: E1001 16:20:14.154605 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerName="init" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154612 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerName="init" Oct 01 16:20:14 crc kubenswrapper[4764]: E1001 16:20:14.154624 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerName="dnsmasq-dns" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154632 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerName="dnsmasq-dns" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154855 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="cinder-scheduler" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154888 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5128f87a-55a2-419c-aa4d-3b79288c8910" containerName="dnsmasq-dns" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.154916 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" containerName="probe" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.160138 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.164506 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.174808 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.174857 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.175399 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-67727" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.191692 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f03dca-8e46-4db3-a187-6d56924bd91e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:14 crc 
kubenswrapper[4764]: I1001 16:20:14.293558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d91465c-097e-4579-a5de-df0547d06dbf-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.293627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d91465c-097e-4579-a5de-df0547d06dbf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.293718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv4kh\" (UniqueName: \"kubernetes.io/projected/2d91465c-097e-4579-a5de-df0547d06dbf-kube-api-access-vv4kh\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.293767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d91465c-097e-4579-a5de-df0547d06dbf-openstack-config\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.390034 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.395644 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d91465c-097e-4579-a5de-df0547d06dbf-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.395766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv4kh\" (UniqueName: \"kubernetes.io/projected/2d91465c-097e-4579-a5de-df0547d06dbf-kube-api-access-vv4kh\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.395820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d91465c-097e-4579-a5de-df0547d06dbf-openstack-config\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.395903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d91465c-097e-4579-a5de-df0547d06dbf-openstack-config-secret\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.396859 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2d91465c-097e-4579-a5de-df0547d06dbf-openstack-config\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.397617 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.399299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2d91465c-097e-4579-a5de-df0547d06dbf-openstack-config-secret\") pod \"openstackclient\" 
(UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.401596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d91465c-097e-4579-a5de-df0547d06dbf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.410252 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.411846 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.418431 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.418493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv4kh\" (UniqueName: \"kubernetes.io/projected/2d91465c-097e-4579-a5de-df0547d06dbf-kube-api-access-vv4kh\") pod \"openstackclient\" (UID: \"2d91465c-097e-4579-a5de-df0547d06dbf\") " pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.430999 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.504134 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.598558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nm79\" (UniqueName: \"kubernetes.io/projected/cb0e912d-791f-436a-9e94-1e60281b6654-kube-api-access-6nm79\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.598818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.598837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.598899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.598949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb0e912d-791f-436a-9e94-1e60281b6654-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " 
pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.598982 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.702338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.702429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb0e912d-791f-436a-9e94-1e60281b6654-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.702465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.702503 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nm79\" (UniqueName: \"kubernetes.io/projected/cb0e912d-791f-436a-9e94-1e60281b6654-kube-api-access-6nm79\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.702522 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.702537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.707844 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.707971 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.708356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.708659 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb0e912d-791f-436a-9e94-1e60281b6654-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.715383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb0e912d-791f-436a-9e94-1e60281b6654-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.723524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nm79\" (UniqueName: \"kubernetes.io/projected/cb0e912d-791f-436a-9e94-1e60281b6654-kube-api-access-6nm79\") pod \"cinder-scheduler-0\" (UID: \"cb0e912d-791f-436a-9e94-1e60281b6654\") " pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.780151 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 16:20:14 crc kubenswrapper[4764]: I1001 16:20:14.959030 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 16:20:14 crc kubenswrapper[4764]: W1001 16:20:14.974586 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d91465c_097e_4579_a5de_df0547d06dbf.slice/crio-2aecb9cf2c3f7b1edf4269e2e8653bd047c84494d35d5ee5dcb3dc3119c3ff4c WatchSource:0}: Error finding container 2aecb9cf2c3f7b1edf4269e2e8653bd047c84494d35d5ee5dcb3dc3119c3ff4c: Status 404 returned error can't find the container with id 2aecb9cf2c3f7b1edf4269e2e8653bd047c84494d35d5ee5dcb3dc3119c3ff4c Oct 01 16:20:15 crc kubenswrapper[4764]: I1001 16:20:15.065936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2d91465c-097e-4579-a5de-df0547d06dbf","Type":"ContainerStarted","Data":"2aecb9cf2c3f7b1edf4269e2e8653bd047c84494d35d5ee5dcb3dc3119c3ff4c"} Oct 01 16:20:15 crc 
kubenswrapper[4764]: I1001 16:20:15.276275 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 16:20:15 crc kubenswrapper[4764]: W1001 16:20:15.281712 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb0e912d_791f_436a_9e94_1e60281b6654.slice/crio-1b141c025e89e862dbc371dfdbe27f4c4026b0419cdebd0d1eef88d033e0b798 WatchSource:0}: Error finding container 1b141c025e89e862dbc371dfdbe27f4c4026b0419cdebd0d1eef88d033e0b798: Status 404 returned error can't find the container with id 1b141c025e89e862dbc371dfdbe27f4c4026b0419cdebd0d1eef88d033e0b798 Oct 01 16:20:15 crc kubenswrapper[4764]: I1001 16:20:15.731348 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f03dca-8e46-4db3-a187-6d56924bd91e" path="/var/lib/kubelet/pods/98f03dca-8e46-4db3-a187-6d56924bd91e/volumes" Oct 01 16:20:16 crc kubenswrapper[4764]: I1001 16:20:16.078499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb0e912d-791f-436a-9e94-1e60281b6654","Type":"ContainerStarted","Data":"2dd3c5791a9970a371f460f646a108df04f0b9d31ef6193ef162b248026224f8"} Oct 01 16:20:16 crc kubenswrapper[4764]: I1001 16:20:16.078798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb0e912d-791f-436a-9e94-1e60281b6654","Type":"ContainerStarted","Data":"1b141c025e89e862dbc371dfdbe27f4c4026b0419cdebd0d1eef88d033e0b798"} Oct 01 16:20:16 crc kubenswrapper[4764]: I1001 16:20:16.564876 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 16:20:17 crc kubenswrapper[4764]: I1001 16:20:17.093798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"cb0e912d-791f-436a-9e94-1e60281b6654","Type":"ContainerStarted","Data":"d506003451ffb30172f26a3baccc72ec1f8752d49955da7c4bb04c2f8090577e"} Oct 01 16:20:17 crc kubenswrapper[4764]: I1001 16:20:17.112001 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.111982517 podStartE2EDuration="3.111982517s" podCreationTimestamp="2025-10-01 16:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:20:17.10927928 +0000 UTC m=+1080.108926115" watchObservedRunningTime="2025-10-01 16:20:17.111982517 +0000 UTC m=+1080.111629352" Oct 01 16:20:18 crc kubenswrapper[4764]: I1001 16:20:18.103549 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:20:19 crc kubenswrapper[4764]: I1001 16:20:19.781448 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 16:20:20 crc kubenswrapper[4764]: I1001 16:20:20.124791 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d577ff6cf-5gk59" Oct 01 16:20:20 crc kubenswrapper[4764]: I1001 16:20:20.237939 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-777fd8fcb-q2t4n"] Oct 01 16:20:20 crc kubenswrapper[4764]: I1001 16:20:20.238182 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-777fd8fcb-q2t4n" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-api" containerID="cri-o://856168f7c342b4d5acde2b536aa2f0028b03075a3afa724412429c09504ec5ce" gracePeriod=30 Oct 01 16:20:20 crc kubenswrapper[4764]: I1001 16:20:20.238603 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-777fd8fcb-q2t4n" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" 
containerName="neutron-httpd" containerID="cri-o://01b5bd367bf8914e501d8c9c4c76458e12811b1d089c86cf579a2c7aacaa4490" gracePeriod=30 Oct 01 16:20:21 crc kubenswrapper[4764]: I1001 16:20:21.155749 4764 generic.go:334] "Generic (PLEG): container finished" podID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerID="01b5bd367bf8914e501d8c9c4c76458e12811b1d089c86cf579a2c7aacaa4490" exitCode=0 Oct 01 16:20:21 crc kubenswrapper[4764]: I1001 16:20:21.155810 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-777fd8fcb-q2t4n" event={"ID":"a44c789b-d197-43fe-ad1e-72f8d0c70cca","Type":"ContainerDied","Data":"01b5bd367bf8914e501d8c9c4c76458e12811b1d089c86cf579a2c7aacaa4490"} Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.076825 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.155754 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-x59d8"] Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.156823 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.179800 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x59d8"] Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.254655 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-t4ftm"] Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.255683 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.263939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t47tz\" (UniqueName: \"kubernetes.io/projected/7418243b-5584-4f53-bded-549e34e415ca-kube-api-access-t47tz\") pod \"nova-api-db-create-x59d8\" (UID: \"7418243b-5584-4f53-bded-549e34e415ca\") " pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.264867 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t4ftm"] Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.358203 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mz6gj"] Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.360490 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.365105 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t47tz\" (UniqueName: \"kubernetes.io/projected/7418243b-5584-4f53-bded-549e34e415ca-kube-api-access-t47tz\") pod \"nova-api-db-create-x59d8\" (UID: \"7418243b-5584-4f53-bded-549e34e415ca\") " pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.365175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zkgr\" (UniqueName: \"kubernetes.io/projected/8756a942-4b09-43eb-b2a5-30048c7fa903-kube-api-access-8zkgr\") pod \"nova-cell0-db-create-t4ftm\" (UID: \"8756a942-4b09-43eb-b2a5-30048c7fa903\") " pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.372153 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mz6gj"] Oct 01 16:20:23 crc 
kubenswrapper[4764]: I1001 16:20:23.391600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t47tz\" (UniqueName: \"kubernetes.io/projected/7418243b-5584-4f53-bded-549e34e415ca-kube-api-access-t47tz\") pod \"nova-api-db-create-x59d8\" (UID: \"7418243b-5584-4f53-bded-549e34e415ca\") " pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.467121 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxlk\" (UniqueName: \"kubernetes.io/projected/3867e107-7704-4f0f-acf6-356bd78d71af-kube-api-access-8xxlk\") pod \"nova-cell1-db-create-mz6gj\" (UID: \"3867e107-7704-4f0f-acf6-356bd78d71af\") " pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.467467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zkgr\" (UniqueName: \"kubernetes.io/projected/8756a942-4b09-43eb-b2a5-30048c7fa903-kube-api-access-8zkgr\") pod \"nova-cell0-db-create-t4ftm\" (UID: \"8756a942-4b09-43eb-b2a5-30048c7fa903\") " pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.479918 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.490685 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zkgr\" (UniqueName: \"kubernetes.io/projected/8756a942-4b09-43eb-b2a5-30048c7fa903-kube-api-access-8zkgr\") pod \"nova-cell0-db-create-t4ftm\" (UID: \"8756a942-4b09-43eb-b2a5-30048c7fa903\") " pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.568932 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxlk\" (UniqueName: \"kubernetes.io/projected/3867e107-7704-4f0f-acf6-356bd78d71af-kube-api-access-8xxlk\") pod \"nova-cell1-db-create-mz6gj\" (UID: \"3867e107-7704-4f0f-acf6-356bd78d71af\") " pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.571463 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.584999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xxlk\" (UniqueName: \"kubernetes.io/projected/3867e107-7704-4f0f-acf6-356bd78d71af-kube-api-access-8xxlk\") pod \"nova-cell1-db-create-mz6gj\" (UID: \"3867e107-7704-4f0f-acf6-356bd78d71af\") " pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:23 crc kubenswrapper[4764]: I1001 16:20:23.679116 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:24 crc kubenswrapper[4764]: I1001 16:20:24.199934 4764 generic.go:334] "Generic (PLEG): container finished" podID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerID="856168f7c342b4d5acde2b536aa2f0028b03075a3afa724412429c09504ec5ce" exitCode=0 Oct 01 16:20:24 crc kubenswrapper[4764]: I1001 16:20:24.200293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-777fd8fcb-q2t4n" event={"ID":"a44c789b-d197-43fe-ad1e-72f8d0c70cca","Type":"ContainerDied","Data":"856168f7c342b4d5acde2b536aa2f0028b03075a3afa724412429c09504ec5ce"} Oct 01 16:20:24 crc kubenswrapper[4764]: I1001 16:20:24.773587 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t4ftm"] Oct 01 16:20:24 crc kubenswrapper[4764]: I1001 16:20:24.987701 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x59d8"] Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.003765 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mz6gj"] Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.073328 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.175311 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.224713 4764 generic.go:334] "Generic (PLEG): container finished" podID="8756a942-4b09-43eb-b2a5-30048c7fa903" containerID="0b746ef53fd2437209b8aeb04cf0ceba9b7b17165435e9b4ac88d57c68206de2" exitCode=0 Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.224773 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t4ftm" event={"ID":"8756a942-4b09-43eb-b2a5-30048c7fa903","Type":"ContainerDied","Data":"0b746ef53fd2437209b8aeb04cf0ceba9b7b17165435e9b4ac88d57c68206de2"} Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.224798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t4ftm" event={"ID":"8756a942-4b09-43eb-b2a5-30048c7fa903","Type":"ContainerStarted","Data":"8b26ceab60da0f9348ded78478eb5a49b2f41b5ae0b8abb8ef00355d67c7a092"} Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.233525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-777fd8fcb-q2t4n" event={"ID":"a44c789b-d197-43fe-ad1e-72f8d0c70cca","Type":"ContainerDied","Data":"a2e87f5585cbcd2a9d7159bbf0a18d40a9696b3c1759bef99806004e9cfb23ba"} Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.233579 4764 scope.go:117] "RemoveContainer" containerID="01b5bd367bf8914e501d8c9c4c76458e12811b1d089c86cf579a2c7aacaa4490" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.233699 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-777fd8fcb-q2t4n" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.235785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x59d8" event={"ID":"7418243b-5584-4f53-bded-549e34e415ca","Type":"ContainerStarted","Data":"3d1353f09d54a4a3c3af273ddd4a5b52154e8f1ac86ba02d3e1c15c6d697eca6"} Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.243768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mz6gj" event={"ID":"3867e107-7704-4f0f-acf6-356bd78d71af","Type":"ContainerStarted","Data":"97229ae29ec7696dfda9ccd76546c284b3888a74eb988727bda7f810f8226ef0"} Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.245640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"2d91465c-097e-4579-a5de-df0547d06dbf","Type":"ContainerStarted","Data":"c5acd9905eea739c2c1dd84114bfbe000cfcd1175d2199eaff86fbcda63358bb"} Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.263893 4764 scope.go:117] "RemoveContainer" containerID="856168f7c342b4d5acde2b536aa2f0028b03075a3afa724412429c09504ec5ce" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.271290 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.957075356 podStartE2EDuration="11.271272899s" podCreationTimestamp="2025-10-01 16:20:14 +0000 UTC" firstStartedPulling="2025-10-01 16:20:14.975694116 +0000 UTC m=+1077.975340951" lastFinishedPulling="2025-10-01 16:20:24.289891659 +0000 UTC m=+1087.289538494" observedRunningTime="2025-10-01 16:20:25.270519491 +0000 UTC m=+1088.270166336" watchObservedRunningTime="2025-10-01 16:20:25.271272899 +0000 UTC m=+1088.270919734" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.318675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-combined-ca-bundle\") pod \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.318781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-config\") pod \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.318803 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-ovndb-tls-certs\") pod \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.318859 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-httpd-config\") pod \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.318996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw6wr\" (UniqueName: \"kubernetes.io/projected/a44c789b-d197-43fe-ad1e-72f8d0c70cca-kube-api-access-sw6wr\") pod \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\" (UID: \"a44c789b-d197-43fe-ad1e-72f8d0c70cca\") " Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.327102 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a44c789b-d197-43fe-ad1e-72f8d0c70cca" (UID: "a44c789b-d197-43fe-ad1e-72f8d0c70cca"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.328345 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44c789b-d197-43fe-ad1e-72f8d0c70cca-kube-api-access-sw6wr" (OuterVolumeSpecName: "kube-api-access-sw6wr") pod "a44c789b-d197-43fe-ad1e-72f8d0c70cca" (UID: "a44c789b-d197-43fe-ad1e-72f8d0c70cca"). InnerVolumeSpecName "kube-api-access-sw6wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.380717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a44c789b-d197-43fe-ad1e-72f8d0c70cca" (UID: "a44c789b-d197-43fe-ad1e-72f8d0c70cca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.400024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-config" (OuterVolumeSpecName: "config") pod "a44c789b-d197-43fe-ad1e-72f8d0c70cca" (UID: "a44c789b-d197-43fe-ad1e-72f8d0c70cca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.414151 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a44c789b-d197-43fe-ad1e-72f8d0c70cca" (UID: "a44c789b-d197-43fe-ad1e-72f8d0c70cca"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.424301 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw6wr\" (UniqueName: \"kubernetes.io/projected/a44c789b-d197-43fe-ad1e-72f8d0c70cca-kube-api-access-sw6wr\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.424333 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.424343 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.424352 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.424363 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a44c789b-d197-43fe-ad1e-72f8d0c70cca-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.562531 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-777fd8fcb-q2t4n"] Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.568576 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-777fd8fcb-q2t4n"] Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.696460 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.696737 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-central-agent" containerID="cri-o://f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271" gracePeriod=30 Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.696854 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-notification-agent" containerID="cri-o://2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9" gracePeriod=30 Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.696907 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="proxy-httpd" containerID="cri-o://e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4" gracePeriod=30 Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.696897 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="sg-core" containerID="cri-o://2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc" gracePeriod=30 Oct 01 16:20:25 crc kubenswrapper[4764]: I1001 16:20:25.731637 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" path="/var/lib/kubelet/pods/a44c789b-d197-43fe-ad1e-72f8d0c70cca/volumes" Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.255582 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerID="e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4" exitCode=0 Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.255858 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" 
containerID="2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc" exitCode=2 Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.255867 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerID="f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271" exitCode=0 Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.255672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerDied","Data":"e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4"} Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.256798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerDied","Data":"2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc"} Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.256818 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerDied","Data":"f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271"} Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.260347 4764 generic.go:334] "Generic (PLEG): container finished" podID="7418243b-5584-4f53-bded-549e34e415ca" containerID="7cea4baac5c06c4f44ec89fbfc9cc94b99aa97a611ac96f33ca49f1e92ad6d36" exitCode=0 Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.260384 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x59d8" event={"ID":"7418243b-5584-4f53-bded-549e34e415ca","Type":"ContainerDied","Data":"7cea4baac5c06c4f44ec89fbfc9cc94b99aa97a611ac96f33ca49f1e92ad6d36"} Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.262413 4764 generic.go:334] "Generic (PLEG): container finished" podID="3867e107-7704-4f0f-acf6-356bd78d71af" 
containerID="70fb3d7826816ce50f8cd372843fa913b9ad4a73f6f38c5d71ac1f12f0328310" exitCode=0 Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.262974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mz6gj" event={"ID":"3867e107-7704-4f0f-acf6-356bd78d71af","Type":"ContainerDied","Data":"70fb3d7826816ce50f8cd372843fa913b9ad4a73f6f38c5d71ac1f12f0328310"} Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.613067 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.767001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zkgr\" (UniqueName: \"kubernetes.io/projected/8756a942-4b09-43eb-b2a5-30048c7fa903-kube-api-access-8zkgr\") pod \"8756a942-4b09-43eb-b2a5-30048c7fa903\" (UID: \"8756a942-4b09-43eb-b2a5-30048c7fa903\") " Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.782176 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8756a942-4b09-43eb-b2a5-30048c7fa903-kube-api-access-8zkgr" (OuterVolumeSpecName: "kube-api-access-8zkgr") pod "8756a942-4b09-43eb-b2a5-30048c7fa903" (UID: "8756a942-4b09-43eb-b2a5-30048c7fa903"). InnerVolumeSpecName "kube-api-access-8zkgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:26 crc kubenswrapper[4764]: I1001 16:20:26.869501 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zkgr\" (UniqueName: \"kubernetes.io/projected/8756a942-4b09-43eb-b2a5-30048c7fa903-kube-api-access-8zkgr\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.274246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t4ftm" event={"ID":"8756a942-4b09-43eb-b2a5-30048c7fa903","Type":"ContainerDied","Data":"8b26ceab60da0f9348ded78478eb5a49b2f41b5ae0b8abb8ef00355d67c7a092"} Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.274300 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b26ceab60da0f9348ded78478eb5a49b2f41b5ae0b8abb8ef00355d67c7a092" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.274416 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t4ftm" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.644758 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.769212 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.784666 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xxlk\" (UniqueName: \"kubernetes.io/projected/3867e107-7704-4f0f-acf6-356bd78d71af-kube-api-access-8xxlk\") pod \"3867e107-7704-4f0f-acf6-356bd78d71af\" (UID: \"3867e107-7704-4f0f-acf6-356bd78d71af\") " Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.790265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3867e107-7704-4f0f-acf6-356bd78d71af-kube-api-access-8xxlk" (OuterVolumeSpecName: "kube-api-access-8xxlk") pod "3867e107-7704-4f0f-acf6-356bd78d71af" (UID: "3867e107-7704-4f0f-acf6-356bd78d71af"). InnerVolumeSpecName "kube-api-access-8xxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.886647 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t47tz\" (UniqueName: \"kubernetes.io/projected/7418243b-5584-4f53-bded-549e34e415ca-kube-api-access-t47tz\") pod \"7418243b-5584-4f53-bded-549e34e415ca\" (UID: \"7418243b-5584-4f53-bded-549e34e415ca\") " Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.887379 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xxlk\" (UniqueName: \"kubernetes.io/projected/3867e107-7704-4f0f-acf6-356bd78d71af-kube-api-access-8xxlk\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.898253 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7418243b-5584-4f53-bded-549e34e415ca-kube-api-access-t47tz" (OuterVolumeSpecName: "kube-api-access-t47tz") pod "7418243b-5584-4f53-bded-549e34e415ca" (UID: "7418243b-5584-4f53-bded-549e34e415ca"). InnerVolumeSpecName "kube-api-access-t47tz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:27 crc kubenswrapper[4764]: I1001 16:20:27.989347 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t47tz\" (UniqueName: \"kubernetes.io/projected/7418243b-5584-4f53-bded-549e34e415ca-kube-api-access-t47tz\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:28 crc kubenswrapper[4764]: I1001 16:20:28.283381 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x59d8" Oct 01 16:20:28 crc kubenswrapper[4764]: I1001 16:20:28.283358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x59d8" event={"ID":"7418243b-5584-4f53-bded-549e34e415ca","Type":"ContainerDied","Data":"3d1353f09d54a4a3c3af273ddd4a5b52154e8f1ac86ba02d3e1c15c6d697eca6"} Oct 01 16:20:28 crc kubenswrapper[4764]: I1001 16:20:28.283477 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d1353f09d54a4a3c3af273ddd4a5b52154e8f1ac86ba02d3e1c15c6d697eca6" Oct 01 16:20:28 crc kubenswrapper[4764]: I1001 16:20:28.284798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mz6gj" event={"ID":"3867e107-7704-4f0f-acf6-356bd78d71af","Type":"ContainerDied","Data":"97229ae29ec7696dfda9ccd76546c284b3888a74eb988727bda7f810f8226ef0"} Oct 01 16:20:28 crc kubenswrapper[4764]: I1001 16:20:28.284835 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97229ae29ec7696dfda9ccd76546c284b3888a74eb988727bda7f810f8226ef0" Oct 01 16:20:28 crc kubenswrapper[4764]: I1001 16:20:28.284836 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mz6gj" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.025326 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.142334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-scripts\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.142395 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-config-data\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.142446 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-sg-core-conf-yaml\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.142489 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-combined-ca-bundle\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.142508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8h9n\" (UniqueName: \"kubernetes.io/projected/d3fe3fe7-36a4-4161-820e-dd33c7119f48-kube-api-access-b8h9n\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.142528 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-log-httpd\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.143648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.143763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-run-httpd\") pod \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\" (UID: \"d3fe3fe7-36a4-4161-820e-dd33c7119f48\") " Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.144119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.144574 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.144589 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fe3fe7-36a4-4161-820e-dd33c7119f48-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.150186 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-scripts" (OuterVolumeSpecName: "scripts") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.157298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fe3fe7-36a4-4161-820e-dd33c7119f48-kube-api-access-b8h9n" (OuterVolumeSpecName: "kube-api-access-b8h9n") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "kube-api-access-b8h9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.171185 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.210611 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.239131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-config-data" (OuterVolumeSpecName: "config-data") pod "d3fe3fe7-36a4-4161-820e-dd33c7119f48" (UID: "d3fe3fe7-36a4-4161-820e-dd33c7119f48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.246147 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.246168 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.246180 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.246189 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8h9n\" (UniqueName: \"kubernetes.io/projected/d3fe3fe7-36a4-4161-820e-dd33c7119f48-kube-api-access-b8h9n\") on node \"crc\" 
DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.246197 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fe3fe7-36a4-4161-820e-dd33c7119f48-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.311421 4764 generic.go:334] "Generic (PLEG): container finished" podID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerID="2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9" exitCode=0 Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.311674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerDied","Data":"2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9"} Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.311844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fe3fe7-36a4-4161-820e-dd33c7119f48","Type":"ContainerDied","Data":"07bd4432280db87da4349f717a12a1ce9d137e4997b1405fde419e749fba758b"} Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.311913 4764 scope.go:117] "RemoveContainer" containerID="e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.311697 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.338030 4764 scope.go:117] "RemoveContainer" containerID="2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.342750 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.359502 4764 scope.go:117] "RemoveContainer" containerID="2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.360646 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.384661 4764 scope.go:117] "RemoveContainer" containerID="f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.391953 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392312 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7418243b-5584-4f53-bded-549e34e415ca" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392332 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7418243b-5584-4f53-bded-549e34e415ca" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392356 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3867e107-7704-4f0f-acf6-356bd78d71af" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392362 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3867e107-7704-4f0f-acf6-356bd78d71af" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392369 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-notification-agent" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392377 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-notification-agent" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392387 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="sg-core" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392394 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="sg-core" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392408 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="proxy-httpd" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392414 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="proxy-httpd" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392429 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8756a942-4b09-43eb-b2a5-30048c7fa903" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392435 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8756a942-4b09-43eb-b2a5-30048c7fa903" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392444 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-central-agent" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392449 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-central-agent" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392456 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-api" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392462 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-api" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.392476 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-httpd" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392481 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-httpd" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392639 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8756a942-4b09-43eb-b2a5-30048c7fa903" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392652 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="sg-core" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392665 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-httpd" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392676 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7418243b-5584-4f53-bded-549e34e415ca" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392683 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-central-agent" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44c789b-d197-43fe-ad1e-72f8d0c70cca" containerName="neutron-api" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392704 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3867e107-7704-4f0f-acf6-356bd78d71af" containerName="mariadb-database-create" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392714 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="ceilometer-notification-agent" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.392724 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" containerName="proxy-httpd" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.394578 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.398211 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.398637 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.410548 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.419882 4764 scope.go:117] "RemoveContainer" containerID="e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.421238 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4\": container with ID starting with e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4 not found: ID does not exist" containerID="e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.421305 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4"} err="failed to get container status \"e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4\": rpc error: code = NotFound desc = could not find container \"e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4\": container with ID starting with e53efd7b4cd2e4ac617486513058a911ccdd098013a3054796e5754aa44356a4 not found: ID does not exist" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.421337 4764 scope.go:117] "RemoveContainer" containerID="2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.421607 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc\": container with ID starting with 2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc not found: ID does not exist" containerID="2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.421638 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc"} err="failed to get container status \"2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc\": rpc error: code = NotFound desc = could not find container \"2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc\": container with ID starting with 2b9a04945a62ae4269a8453f474933fb79cc9b5c4eeb9991d33dd991e33ea7cc not found: ID does not exist" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.421654 4764 scope.go:117] "RemoveContainer" containerID="2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.421861 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9\": container with ID starting with 2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9 not found: ID does not exist" containerID="2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.421893 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9"} err="failed to get container status \"2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9\": rpc error: code = NotFound desc = could not find container \"2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9\": container with ID starting with 2e716dbca9226de637d42288bfc901dd242bfd4cff8aa515021580864b56c3b9 not found: ID does not exist" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.421914 4764 scope.go:117] "RemoveContainer" containerID="f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271" Oct 01 16:20:31 crc kubenswrapper[4764]: E1001 16:20:31.422133 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271\": container with ID starting with f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271 not found: ID does not exist" containerID="f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.422153 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271"} err="failed to get container status \"f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271\": rpc error: code = NotFound desc = could not find container 
\"f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271\": container with ID starting with f411eeaccf2673f77e018c8ef0e2682d69cb487f4a01d49604fa8a15ae088271 not found: ID does not exist" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.449697 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-run-httpd\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.449946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.450087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-scripts\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.450155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.450232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-log-httpd\") pod \"ceilometer-0\" (UID: 
\"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.450298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-config-data\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.450365 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5r9b\" (UniqueName: \"kubernetes.io/projected/545e155b-997f-4d83-8837-ffb96dc22950-kube-api-access-x5r9b\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-run-httpd\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552145 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-scripts\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552196 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552224 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-log-httpd\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-config-data\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.552266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5r9b\" (UniqueName: \"kubernetes.io/projected/545e155b-997f-4d83-8837-ffb96dc22950-kube-api-access-x5r9b\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.553410 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-log-httpd\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.553574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-run-httpd\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " 
pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.557789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.558311 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-config-data\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.559024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-scripts\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.559288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.578541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5r9b\" (UniqueName: \"kubernetes.io/projected/545e155b-997f-4d83-8837-ffb96dc22950-kube-api-access-x5r9b\") pod \"ceilometer-0\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") " pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.718892 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:20:31 crc kubenswrapper[4764]: I1001 16:20:31.741315 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fe3fe7-36a4-4161-820e-dd33c7119f48" path="/var/lib/kubelet/pods/d3fe3fe7-36a4-4161-820e-dd33c7119f48/volumes" Oct 01 16:20:32 crc kubenswrapper[4764]: I1001 16:20:32.262720 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:32 crc kubenswrapper[4764]: I1001 16:20:32.322515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerStarted","Data":"64be5f401e934ea0708e7938b4cdb65d4a1e7688c3aa5877f47d2e25968dcf62"} Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.331756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerStarted","Data":"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"} Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.507274 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9ed4-account-create-q2w86"] Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.508670 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9ed4-account-create-q2w86" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.518939 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.524450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9ed4-account-create-q2w86"] Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.588163 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7dj\" (UniqueName: \"kubernetes.io/projected/a8c79489-3d24-4631-b12e-3df33c87c1e0-kube-api-access-sc7dj\") pod \"nova-cell0-9ed4-account-create-q2w86\" (UID: \"a8c79489-3d24-4631-b12e-3df33c87c1e0\") " pod="openstack/nova-cell0-9ed4-account-create-q2w86" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.692746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7dj\" (UniqueName: \"kubernetes.io/projected/a8c79489-3d24-4631-b12e-3df33c87c1e0-kube-api-access-sc7dj\") pod \"nova-cell0-9ed4-account-create-q2w86\" (UID: \"a8c79489-3d24-4631-b12e-3df33c87c1e0\") " pod="openstack/nova-cell0-9ed4-account-create-q2w86" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.700175 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8701-account-create-79w9q"] Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.701978 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8701-account-create-79w9q" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.704825 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.715629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7dj\" (UniqueName: \"kubernetes.io/projected/a8c79489-3d24-4631-b12e-3df33c87c1e0-kube-api-access-sc7dj\") pod \"nova-cell0-9ed4-account-create-q2w86\" (UID: \"a8c79489-3d24-4631-b12e-3df33c87c1e0\") " pod="openstack/nova-cell0-9ed4-account-create-q2w86" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.719810 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8701-account-create-79w9q"] Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.905013 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9ed4-account-create-q2w86" Oct 01 16:20:33 crc kubenswrapper[4764]: I1001 16:20:33.906777 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8krs\" (UniqueName: \"kubernetes.io/projected/211211ae-6ed4-434c-86a7-f58ef4c5428b-kube-api-access-d8krs\") pod \"nova-cell1-8701-account-create-79w9q\" (UID: \"211211ae-6ed4-434c-86a7-f58ef4c5428b\") " pod="openstack/nova-cell1-8701-account-create-79w9q" Oct 01 16:20:34 crc kubenswrapper[4764]: I1001 16:20:34.008515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8krs\" (UniqueName: \"kubernetes.io/projected/211211ae-6ed4-434c-86a7-f58ef4c5428b-kube-api-access-d8krs\") pod \"nova-cell1-8701-account-create-79w9q\" (UID: \"211211ae-6ed4-434c-86a7-f58ef4c5428b\") " pod="openstack/nova-cell1-8701-account-create-79w9q" Oct 01 16:20:34 crc kubenswrapper[4764]: I1001 16:20:34.034761 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-d8krs\" (UniqueName: \"kubernetes.io/projected/211211ae-6ed4-434c-86a7-f58ef4c5428b-kube-api-access-d8krs\") pod \"nova-cell1-8701-account-create-79w9q\" (UID: \"211211ae-6ed4-434c-86a7-f58ef4c5428b\") " pod="openstack/nova-cell1-8701-account-create-79w9q"
Oct 01 16:20:34 crc kubenswrapper[4764]: I1001 16:20:34.319193 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8701-account-create-79w9q"
Oct 01 16:20:34 crc kubenswrapper[4764]: I1001 16:20:34.342973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerStarted","Data":"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"}
Oct 01 16:20:34 crc kubenswrapper[4764]: I1001 16:20:34.466362 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9ed4-account-create-q2w86"]
Oct 01 16:20:34 crc kubenswrapper[4764]: I1001 16:20:34.815454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8701-account-create-79w9q"]
Oct 01 16:20:34 crc kubenswrapper[4764]: W1001 16:20:34.816610 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod211211ae_6ed4_434c_86a7_f58ef4c5428b.slice/crio-fed47835156e26df835ef298c1c2590df5aa7103685d44b517caa1f177f061cd WatchSource:0}: Error finding container fed47835156e26df835ef298c1c2590df5aa7103685d44b517caa1f177f061cd: Status 404 returned error can't find the container with id fed47835156e26df835ef298c1c2590df5aa7103685d44b517caa1f177f061cd
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.365758 4764 generic.go:334] "Generic (PLEG): container finished" podID="211211ae-6ed4-434c-86a7-f58ef4c5428b" containerID="3c296db50d560a8194c5e3e898dea0151039b5fa1c97fe541c0da657dfe76449" exitCode=0
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.365828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8701-account-create-79w9q" event={"ID":"211211ae-6ed4-434c-86a7-f58ef4c5428b","Type":"ContainerDied","Data":"3c296db50d560a8194c5e3e898dea0151039b5fa1c97fe541c0da657dfe76449"}
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.365890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8701-account-create-79w9q" event={"ID":"211211ae-6ed4-434c-86a7-f58ef4c5428b","Type":"ContainerStarted","Data":"fed47835156e26df835ef298c1c2590df5aa7103685d44b517caa1f177f061cd"}
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.372278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerStarted","Data":"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"}
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.374764 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8c79489-3d24-4631-b12e-3df33c87c1e0" containerID="c9b1c379e6188dbd099eba3b23671f6ec67044172ef7307a10e979cb87f578db" exitCode=0
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.374835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9ed4-account-create-q2w86" event={"ID":"a8c79489-3d24-4631-b12e-3df33c87c1e0","Type":"ContainerDied","Data":"c9b1c379e6188dbd099eba3b23671f6ec67044172ef7307a10e979cb87f578db"}
Oct 01 16:20:35 crc kubenswrapper[4764]: I1001 16:20:35.374862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9ed4-account-create-q2w86" event={"ID":"a8c79489-3d24-4631-b12e-3df33c87c1e0","Type":"ContainerStarted","Data":"f69ce61a1c7cd4b6230d9932af74882a659857df5f864392a222010fb51de086"}
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.394137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerStarted","Data":"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"}
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.443561 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.13821214 podStartE2EDuration="5.443527769s" podCreationTimestamp="2025-10-01 16:20:31 +0000 UTC" firstStartedPulling="2025-10-01 16:20:32.290257061 +0000 UTC m=+1095.289903896" lastFinishedPulling="2025-10-01 16:20:35.59557268 +0000 UTC m=+1098.595219525" observedRunningTime="2025-10-01 16:20:36.428896167 +0000 UTC m=+1099.428543072" watchObservedRunningTime="2025-10-01 16:20:36.443527769 +0000 UTC m=+1099.443174644"
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.832302 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8701-account-create-79w9q"
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.837936 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9ed4-account-create-q2w86"
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.965541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7dj\" (UniqueName: \"kubernetes.io/projected/a8c79489-3d24-4631-b12e-3df33c87c1e0-kube-api-access-sc7dj\") pod \"a8c79489-3d24-4631-b12e-3df33c87c1e0\" (UID: \"a8c79489-3d24-4631-b12e-3df33c87c1e0\") "
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.965672 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8krs\" (UniqueName: \"kubernetes.io/projected/211211ae-6ed4-434c-86a7-f58ef4c5428b-kube-api-access-d8krs\") pod \"211211ae-6ed4-434c-86a7-f58ef4c5428b\" (UID: \"211211ae-6ed4-434c-86a7-f58ef4c5428b\") "
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.972746 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c79489-3d24-4631-b12e-3df33c87c1e0-kube-api-access-sc7dj" (OuterVolumeSpecName: "kube-api-access-sc7dj") pod "a8c79489-3d24-4631-b12e-3df33c87c1e0" (UID: "a8c79489-3d24-4631-b12e-3df33c87c1e0"). InnerVolumeSpecName "kube-api-access-sc7dj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:20:36 crc kubenswrapper[4764]: I1001 16:20:36.985430 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211211ae-6ed4-434c-86a7-f58ef4c5428b-kube-api-access-d8krs" (OuterVolumeSpecName: "kube-api-access-d8krs") pod "211211ae-6ed4-434c-86a7-f58ef4c5428b" (UID: "211211ae-6ed4-434c-86a7-f58ef4c5428b"). InnerVolumeSpecName "kube-api-access-d8krs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.067828 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7dj\" (UniqueName: \"kubernetes.io/projected/a8c79489-3d24-4631-b12e-3df33c87c1e0-kube-api-access-sc7dj\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.067864 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8krs\" (UniqueName: \"kubernetes.io/projected/211211ae-6ed4-434c-86a7-f58ef4c5428b-kube-api-access-d8krs\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.404799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9ed4-account-create-q2w86" event={"ID":"a8c79489-3d24-4631-b12e-3df33c87c1e0","Type":"ContainerDied","Data":"f69ce61a1c7cd4b6230d9932af74882a659857df5f864392a222010fb51de086"}
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.404844 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69ce61a1c7cd4b6230d9932af74882a659857df5f864392a222010fb51de086"
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.404912 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9ed4-account-create-q2w86"
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.408804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8701-account-create-79w9q" event={"ID":"211211ae-6ed4-434c-86a7-f58ef4c5428b","Type":"ContainerDied","Data":"fed47835156e26df835ef298c1c2590df5aa7103685d44b517caa1f177f061cd"}
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.408862 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8701-account-create-79w9q"
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.408880 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fed47835156e26df835ef298c1c2590df5aa7103685d44b517caa1f177f061cd"
Oct 01 16:20:37 crc kubenswrapper[4764]: I1001 16:20:37.409075 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.726491 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dvptb"]
Oct 01 16:20:38 crc kubenswrapper[4764]: E1001 16:20:38.728392 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c79489-3d24-4631-b12e-3df33c87c1e0" containerName="mariadb-account-create"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.728504 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c79489-3d24-4631-b12e-3df33c87c1e0" containerName="mariadb-account-create"
Oct 01 16:20:38 crc kubenswrapper[4764]: E1001 16:20:38.728608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="211211ae-6ed4-434c-86a7-f58ef4c5428b" containerName="mariadb-account-create"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.728680 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="211211ae-6ed4-434c-86a7-f58ef4c5428b" containerName="mariadb-account-create"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.728959 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c79489-3d24-4631-b12e-3df33c87c1e0" containerName="mariadb-account-create"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.729083 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="211211ae-6ed4-434c-86a7-f58ef4c5428b" containerName="mariadb-account-create"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.729869 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.731565 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.731966 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.735712 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dvptb"]
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.735987 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dhrd7"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.901947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-config-data\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.902022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.902123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-scripts\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:38 crc kubenswrapper[4764]: I1001 16:20:38.902253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwgnb\" (UniqueName: \"kubernetes.io/projected/b0e44383-9ccd-4abc-9cce-aab97cce1388-kube-api-access-xwgnb\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.005992 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-config-data\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.006136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.006173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-scripts\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.006257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwgnb\" (UniqueName: \"kubernetes.io/projected/b0e44383-9ccd-4abc-9cce-aab97cce1388-kube-api-access-xwgnb\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.013829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-config-data\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.013904 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-scripts\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.014552 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.026062 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwgnb\" (UniqueName: \"kubernetes.io/projected/b0e44383-9ccd-4abc-9cce-aab97cce1388-kube-api-access-xwgnb\") pod \"nova-cell0-conductor-db-sync-dvptb\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.048382 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dvptb"
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.462442 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.462922 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-central-agent" containerID="cri-o://75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7" gracePeriod=30
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.463014 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="proxy-httpd" containerID="cri-o://83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7" gracePeriod=30
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.463067 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-notification-agent" containerID="cri-o://191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4" gracePeriod=30
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.463018 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="sg-core" containerID="cri-o://9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88" gracePeriod=30
Oct 01 16:20:39 crc kubenswrapper[4764]: I1001 16:20:39.511763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dvptb"]
Oct 01 16:20:39 crc kubenswrapper[4764]: W1001 16:20:39.519558 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0e44383_9ccd_4abc_9cce_aab97cce1388.slice/crio-d2ef157e9da9b845f709c14f3732f72e0803eeaca7c8a54f89a25c3704e2818d WatchSource:0}: Error finding container d2ef157e9da9b845f709c14f3732f72e0803eeaca7c8a54f89a25c3704e2818d: Status 404 returned error can't find the container with id d2ef157e9da9b845f709c14f3732f72e0803eeaca7c8a54f89a25c3704e2818d
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.265981 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.447698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-run-httpd\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.447766 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-log-httpd\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.447831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5r9b\" (UniqueName: \"kubernetes.io/projected/545e155b-997f-4d83-8837-ffb96dc22950-kube-api-access-x5r9b\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.447971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-combined-ca-bundle\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.448023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-scripts\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.448085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-config-data\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.448142 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-sg-core-conf-yaml\") pod \"545e155b-997f-4d83-8837-ffb96dc22950\" (UID: \"545e155b-997f-4d83-8837-ffb96dc22950\") "
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.450607 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.450868 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.451360 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.451464 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/545e155b-997f-4d83-8837-ffb96dc22950-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.470495 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545e155b-997f-4d83-8837-ffb96dc22950-kube-api-access-x5r9b" (OuterVolumeSpecName: "kube-api-access-x5r9b") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "kube-api-access-x5r9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.483240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-scripts" (OuterVolumeSpecName: "scripts") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.483443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487498 4764 generic.go:334] "Generic (PLEG): container finished" podID="545e155b-997f-4d83-8837-ffb96dc22950" containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7" exitCode=0
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487538 4764 generic.go:334] "Generic (PLEG): container finished" podID="545e155b-997f-4d83-8837-ffb96dc22950" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88" exitCode=2
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487548 4764 generic.go:334] "Generic (PLEG): container finished" podID="545e155b-997f-4d83-8837-ffb96dc22950" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4" exitCode=0
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487557 4764 generic.go:334] "Generic (PLEG): container finished" podID="545e155b-997f-4d83-8837-ffb96dc22950" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7" exitCode=0
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487647 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487645 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerDied","Data":"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"}
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerDied","Data":"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"}
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487698 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerDied","Data":"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"}
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487711 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerDied","Data":"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"}
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487723 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"545e155b-997f-4d83-8837-ffb96dc22950","Type":"ContainerDied","Data":"64be5f401e934ea0708e7938b4cdb65d4a1e7688c3aa5877f47d2e25968dcf62"}
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.487742 4764 scope.go:117] "RemoveContainer" containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.492457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dvptb" event={"ID":"b0e44383-9ccd-4abc-9cce-aab97cce1388","Type":"ContainerStarted","Data":"d2ef157e9da9b845f709c14f3732f72e0803eeaca7c8a54f89a25c3704e2818d"}
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.552793 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.553140 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.553214 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5r9b\" (UniqueName: \"kubernetes.io/projected/545e155b-997f-4d83-8837-ffb96dc22950-kube-api-access-x5r9b\") on node \"crc\" DevicePath \"\""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.552950 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-config-data" (OuterVolumeSpecName: "config-data") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.556106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "545e155b-997f-4d83-8837-ffb96dc22950" (UID: "545e155b-997f-4d83-8837-ffb96dc22950"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.570478 4764 scope.go:117] "RemoveContainer" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.597668 4764 scope.go:117] "RemoveContainer" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.616928 4764 scope.go:117] "RemoveContainer" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.638929 4764 scope.go:117] "RemoveContainer" containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"
Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.639447 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": container with ID starting with 83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7 not found: ID does not exist" containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.639492 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"} err="failed to get container status \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": rpc error: code = NotFound desc = could not find container \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": container with ID starting with 83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.639546 4764 scope.go:117] "RemoveContainer" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"
Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.640261 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": container with ID starting with 9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88 not found: ID does not exist" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.640381 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"} err="failed to get container status \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": rpc error: code = NotFound desc = could not find container \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": container with ID starting with 9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.640417 4764 scope.go:117] "RemoveContainer" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"
Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.640810 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": container with ID starting with 191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4 not found: ID does not exist" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.640840 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"} err="failed to get container status \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": rpc error: code = NotFound desc = could not find container \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": container with ID starting with 191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.640858 4764 scope.go:117] "RemoveContainer" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"
Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.641268 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": container with ID starting with 75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7 not found: ID does not exist" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.641319 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"} err="failed to get container status \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": rpc error: code = NotFound desc = could not find container \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": container with ID starting with 75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.641352 4764 scope.go:117] "RemoveContainer" containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.641548 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"} err="failed to get container status \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": rpc error: code = NotFound desc = could not find container \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": container with ID starting with 83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.641567 4764 scope.go:117] "RemoveContainer" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.641833 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"} err="failed to get container status \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": rpc error: code = NotFound desc = could not find container \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": container with ID starting with 9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.641849 4764 scope.go:117] "RemoveContainer" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642063 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"} err="failed to get container status \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": rpc error: code = NotFound desc = could not find container \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": container with ID starting with 191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642083 4764 scope.go:117] "RemoveContainer" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642260 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"} err="failed to get container status \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": rpc error: code = NotFound desc = could not find container \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": container with ID starting with 75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642277 4764 scope.go:117] "RemoveContainer" containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642493 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"} err="failed to get container status \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": rpc error: code = NotFound desc = could not find container \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": container with ID starting with 83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7 not found: ID does not exist"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642514 4764 scope.go:117] "RemoveContainer" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"
Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642968 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"} err="failed to get
container status \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": rpc error: code = NotFound desc = could not find container \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": container with ID starting with 9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.642997 4764 scope.go:117] "RemoveContainer" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.643227 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"} err="failed to get container status \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": rpc error: code = NotFound desc = could not find container \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": container with ID starting with 191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.643253 4764 scope.go:117] "RemoveContainer" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.643504 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"} err="failed to get container status \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": rpc error: code = NotFound desc = could not find container \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": container with ID starting with 75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.643538 4764 scope.go:117] "RemoveContainer" 
containerID="83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.643909 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7"} err="failed to get container status \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": rpc error: code = NotFound desc = could not find container \"83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7\": container with ID starting with 83a55cb02d9200a632e461362c5747569faf0447999be5e12eb34a19f8b05fe7 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.643933 4764 scope.go:117] "RemoveContainer" containerID="9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.644249 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88"} err="failed to get container status \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": rpc error: code = NotFound desc = could not find container \"9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88\": container with ID starting with 9d13820c31145a05daed17442c8ae6230fef041097a7de66f94d21c91225fa88 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.644276 4764 scope.go:117] "RemoveContainer" containerID="191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.644627 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4"} err="failed to get container status \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": rpc error: code = NotFound desc = could 
not find container \"191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4\": container with ID starting with 191cdf4d9c96132da1a3e74ed37bf265d42bd6baf669d78033b068420172afe4 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.644654 4764 scope.go:117] "RemoveContainer" containerID="75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.645021 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7"} err="failed to get container status \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": rpc error: code = NotFound desc = could not find container \"75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7\": container with ID starting with 75e0462b939ed409c422af571d3fdb75db3a5deb6df1108783ba30218e0a49a7 not found: ID does not exist" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.654299 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.654324 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/545e155b-997f-4d83-8837-ffb96dc22950-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.841114 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.850534 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.866820 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:40 crc 
kubenswrapper[4764]: E1001 16:20:40.867384 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="proxy-httpd" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.867484 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="proxy-httpd" Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.867559 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="sg-core" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.867619 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="sg-core" Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.867678 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-central-agent" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.867733 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-central-agent" Oct 01 16:20:40 crc kubenswrapper[4764]: E1001 16:20:40.867809 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-notification-agent" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.867883 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-notification-agent" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.868122 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="sg-core" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.868191 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="proxy-httpd" Oct 01 16:20:40 
crc kubenswrapper[4764]: I1001 16:20:40.868252 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-notification-agent" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.868325 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="545e155b-997f-4d83-8837-ffb96dc22950" containerName="ceilometer-central-agent" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.870086 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.873915 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.874384 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.882279 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964447 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-scripts\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-log-httpd\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-run-httpd\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-config-data\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964700 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkt5j\" (UniqueName: \"kubernetes.io/projected/9605b1c9-ed82-4e24-b150-f966fd09d0f2-kube-api-access-hkt5j\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:40 crc kubenswrapper[4764]: I1001 16:20:40.964737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071029 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-scripts\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-log-httpd\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-run-httpd\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071308 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-config-data\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071363 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkt5j\" (UniqueName: 
\"kubernetes.io/projected/9605b1c9-ed82-4e24-b150-f966fd09d0f2-kube-api-access-hkt5j\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.071840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-run-httpd\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.072119 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-log-httpd\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.075784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.075843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-config-data\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.082127 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-scripts\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.086739 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.087799 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkt5j\" (UniqueName: \"kubernetes.io/projected/9605b1c9-ed82-4e24-b150-f966fd09d0f2-kube-api-access-hkt5j\") pod \"ceilometer-0\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.193218 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.656928 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:20:41 crc kubenswrapper[4764]: I1001 16:20:41.734317 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="545e155b-997f-4d83-8837-ffb96dc22950" path="/var/lib/kubelet/pods/545e155b-997f-4d83-8837-ffb96dc22950/volumes" Oct 01 16:20:42 crc kubenswrapper[4764]: I1001 16:20:42.519844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerStarted","Data":"848cb876ceb49c147edb0ac3a1c2c3f694c01caf1df581b2b28eb861e68ef40a"} Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.386535 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-af0f-account-create-xhcdp"] Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.388156 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.390938 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.393089 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-af0f-account-create-xhcdp"] Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.529540 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frvxt\" (UniqueName: \"kubernetes.io/projected/4b789fde-2d6b-41ab-bdfb-8a3071d969f5-kube-api-access-frvxt\") pod \"nova-api-af0f-account-create-xhcdp\" (UID: \"4b789fde-2d6b-41ab-bdfb-8a3071d969f5\") " pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.533083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerStarted","Data":"58e9d540739bc1d248198600d53c93fe3ad611c000c70a55d8dff8637923c4e0"} Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.631416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frvxt\" (UniqueName: \"kubernetes.io/projected/4b789fde-2d6b-41ab-bdfb-8a3071d969f5-kube-api-access-frvxt\") pod \"nova-api-af0f-account-create-xhcdp\" (UID: \"4b789fde-2d6b-41ab-bdfb-8a3071d969f5\") " pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:43 crc kubenswrapper[4764]: I1001 16:20:43.658842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frvxt\" (UniqueName: \"kubernetes.io/projected/4b789fde-2d6b-41ab-bdfb-8a3071d969f5-kube-api-access-frvxt\") pod \"nova-api-af0f-account-create-xhcdp\" (UID: \"4b789fde-2d6b-41ab-bdfb-8a3071d969f5\") " pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:43 crc 
kubenswrapper[4764]: I1001 16:20:43.725434 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:50 crc kubenswrapper[4764]: W1001 16:20:50.307975 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b789fde_2d6b_41ab_bdfb_8a3071d969f5.slice/crio-17a95cf2b22962f5bb3c21a53022edf87db5e6db1c117b6dcf8a6e7de189b48c WatchSource:0}: Error finding container 17a95cf2b22962f5bb3c21a53022edf87db5e6db1c117b6dcf8a6e7de189b48c: Status 404 returned error can't find the container with id 17a95cf2b22962f5bb3c21a53022edf87db5e6db1c117b6dcf8a6e7de189b48c Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.309037 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-af0f-account-create-xhcdp"] Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.603392 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerStarted","Data":"b5aefc8d50296d655a2870014ef6aa207b8edda0b6696089503036bb8f852194"} Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.605872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dvptb" event={"ID":"b0e44383-9ccd-4abc-9cce-aab97cce1388","Type":"ContainerStarted","Data":"e93aa8dcb65cb5085dc7f7cd9bbe91cf5fd98006535dba31bd46aa42e7804e4b"} Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.608721 4764 generic.go:334] "Generic (PLEG): container finished" podID="4b789fde-2d6b-41ab-bdfb-8a3071d969f5" containerID="bd63c8eafdb2a5f37ac4d05c15c3e6fee2e9a97cb0ab101b9005fc7db2ee1a4a" exitCode=0 Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.608814 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af0f-account-create-xhcdp" 
event={"ID":"4b789fde-2d6b-41ab-bdfb-8a3071d969f5","Type":"ContainerDied","Data":"bd63c8eafdb2a5f37ac4d05c15c3e6fee2e9a97cb0ab101b9005fc7db2ee1a4a"} Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.608857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af0f-account-create-xhcdp" event={"ID":"4b789fde-2d6b-41ab-bdfb-8a3071d969f5","Type":"ContainerStarted","Data":"17a95cf2b22962f5bb3c21a53022edf87db5e6db1c117b6dcf8a6e7de189b48c"} Oct 01 16:20:50 crc kubenswrapper[4764]: I1001 16:20:50.624964 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-dvptb" podStartSLOduration=2.18884438 podStartE2EDuration="12.624947285s" podCreationTimestamp="2025-10-01 16:20:38 +0000 UTC" firstStartedPulling="2025-10-01 16:20:39.52161319 +0000 UTC m=+1102.521260015" lastFinishedPulling="2025-10-01 16:20:49.957716085 +0000 UTC m=+1112.957362920" observedRunningTime="2025-10-01 16:20:50.620526366 +0000 UTC m=+1113.620173211" watchObservedRunningTime="2025-10-01 16:20:50.624947285 +0000 UTC m=+1113.624594130" Oct 01 16:20:51 crc kubenswrapper[4764]: I1001 16:20:51.622738 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerStarted","Data":"03d3cc5a13f44af7e7ac35d0e927c95fd1668571e56adf6d52ffd6a49939ff37"} Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.071364 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.220728 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frvxt\" (UniqueName: \"kubernetes.io/projected/4b789fde-2d6b-41ab-bdfb-8a3071d969f5-kube-api-access-frvxt\") pod \"4b789fde-2d6b-41ab-bdfb-8a3071d969f5\" (UID: \"4b789fde-2d6b-41ab-bdfb-8a3071d969f5\") " Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.233445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b789fde-2d6b-41ab-bdfb-8a3071d969f5-kube-api-access-frvxt" (OuterVolumeSpecName: "kube-api-access-frvxt") pod "4b789fde-2d6b-41ab-bdfb-8a3071d969f5" (UID: "4b789fde-2d6b-41ab-bdfb-8a3071d969f5"). InnerVolumeSpecName "kube-api-access-frvxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.323409 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frvxt\" (UniqueName: \"kubernetes.io/projected/4b789fde-2d6b-41ab-bdfb-8a3071d969f5-kube-api-access-frvxt\") on node \"crc\" DevicePath \"\"" Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.644601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af0f-account-create-xhcdp" event={"ID":"4b789fde-2d6b-41ab-bdfb-8a3071d969f5","Type":"ContainerDied","Data":"17a95cf2b22962f5bb3c21a53022edf87db5e6db1c117b6dcf8a6e7de189b48c"} Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.644670 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17a95cf2b22962f5bb3c21a53022edf87db5e6db1c117b6dcf8a6e7de189b48c" Oct 01 16:20:52 crc kubenswrapper[4764]: I1001 16:20:52.644773 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-af0f-account-create-xhcdp" Oct 01 16:20:53 crc kubenswrapper[4764]: I1001 16:20:53.657955 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerStarted","Data":"d7989c3875da0c28d8556c4cebc844efab4a1ac5324f5f77e0314f8d7ba22760"} Oct 01 16:20:53 crc kubenswrapper[4764]: I1001 16:20:53.658376 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:20:53 crc kubenswrapper[4764]: I1001 16:20:53.688671 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.390621386 podStartE2EDuration="13.688645494s" podCreationTimestamp="2025-10-01 16:20:40 +0000 UTC" firstStartedPulling="2025-10-01 16:20:41.673887747 +0000 UTC m=+1104.673534582" lastFinishedPulling="2025-10-01 16:20:52.971911815 +0000 UTC m=+1115.971558690" observedRunningTime="2025-10-01 16:20:53.688223884 +0000 UTC m=+1116.687870729" watchObservedRunningTime="2025-10-01 16:20:53.688645494 +0000 UTC m=+1116.688292369" Oct 01 16:21:01 crc kubenswrapper[4764]: I1001 16:21:01.752476 4764 generic.go:334] "Generic (PLEG): container finished" podID="b0e44383-9ccd-4abc-9cce-aab97cce1388" containerID="e93aa8dcb65cb5085dc7f7cd9bbe91cf5fd98006535dba31bd46aa42e7804e4b" exitCode=0 Oct 01 16:21:01 crc kubenswrapper[4764]: I1001 16:21:01.752557 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dvptb" event={"ID":"b0e44383-9ccd-4abc-9cce-aab97cce1388","Type":"ContainerDied","Data":"e93aa8dcb65cb5085dc7f7cd9bbe91cf5fd98006535dba31bd46aa42e7804e4b"} Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.236021 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dvptb" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.367643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-combined-ca-bundle\") pod \"b0e44383-9ccd-4abc-9cce-aab97cce1388\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.367895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-config-data\") pod \"b0e44383-9ccd-4abc-9cce-aab97cce1388\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.367990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-scripts\") pod \"b0e44383-9ccd-4abc-9cce-aab97cce1388\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.368040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwgnb\" (UniqueName: \"kubernetes.io/projected/b0e44383-9ccd-4abc-9cce-aab97cce1388-kube-api-access-xwgnb\") pod \"b0e44383-9ccd-4abc-9cce-aab97cce1388\" (UID: \"b0e44383-9ccd-4abc-9cce-aab97cce1388\") " Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.377189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-scripts" (OuterVolumeSpecName: "scripts") pod "b0e44383-9ccd-4abc-9cce-aab97cce1388" (UID: "b0e44383-9ccd-4abc-9cce-aab97cce1388"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.377269 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e44383-9ccd-4abc-9cce-aab97cce1388-kube-api-access-xwgnb" (OuterVolumeSpecName: "kube-api-access-xwgnb") pod "b0e44383-9ccd-4abc-9cce-aab97cce1388" (UID: "b0e44383-9ccd-4abc-9cce-aab97cce1388"). InnerVolumeSpecName "kube-api-access-xwgnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.417207 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-config-data" (OuterVolumeSpecName: "config-data") pod "b0e44383-9ccd-4abc-9cce-aab97cce1388" (UID: "b0e44383-9ccd-4abc-9cce-aab97cce1388"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.421657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0e44383-9ccd-4abc-9cce-aab97cce1388" (UID: "b0e44383-9ccd-4abc-9cce-aab97cce1388"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.470309 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.470357 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.470376 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwgnb\" (UniqueName: \"kubernetes.io/projected/b0e44383-9ccd-4abc-9cce-aab97cce1388-kube-api-access-xwgnb\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.470394 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0e44383-9ccd-4abc-9cce-aab97cce1388-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.785838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dvptb" event={"ID":"b0e44383-9ccd-4abc-9cce-aab97cce1388","Type":"ContainerDied","Data":"d2ef157e9da9b845f709c14f3732f72e0803eeaca7c8a54f89a25c3704e2818d"} Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.786121 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dvptb" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.786149 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2ef157e9da9b845f709c14f3732f72e0803eeaca7c8a54f89a25c3704e2818d" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.946813 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 16:21:03 crc kubenswrapper[4764]: E1001 16:21:03.947459 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b789fde-2d6b-41ab-bdfb-8a3071d969f5" containerName="mariadb-account-create" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.947490 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b789fde-2d6b-41ab-bdfb-8a3071d969f5" containerName="mariadb-account-create" Oct 01 16:21:03 crc kubenswrapper[4764]: E1001 16:21:03.947511 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e44383-9ccd-4abc-9cce-aab97cce1388" containerName="nova-cell0-conductor-db-sync" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.947526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e44383-9ccd-4abc-9cce-aab97cce1388" containerName="nova-cell0-conductor-db-sync" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.947897 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b789fde-2d6b-41ab-bdfb-8a3071d969f5" containerName="mariadb-account-create" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.947924 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e44383-9ccd-4abc-9cce-aab97cce1388" containerName="nova-cell0-conductor-db-sync" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.949001 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.950936 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dhrd7" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.951091 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 16:21:03 crc kubenswrapper[4764]: I1001 16:21:03.959759 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.081378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.081425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.081463 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqz48\" (UniqueName: \"kubernetes.io/projected/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-kube-api-access-zqz48\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.183850 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqz48\" (UniqueName: 
\"kubernetes.io/projected/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-kube-api-access-zqz48\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.184497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.184616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.194686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.198164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.217502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqz48\" (UniqueName: \"kubernetes.io/projected/eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa-kube-api-access-zqz48\") pod \"nova-cell0-conductor-0\" (UID: 
\"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa\") " pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.267636 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:04 crc kubenswrapper[4764]: I1001 16:21:04.793406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 16:21:04 crc kubenswrapper[4764]: W1001 16:21:04.796663 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb3c7b0f_1d91_4e76_9b15_3322d2dfd3fa.slice/crio-57aa0f6d6792ec5e512d78faf0d6bd9ef171c19c000cbe6bdb4ad26daf0c81eb WatchSource:0}: Error finding container 57aa0f6d6792ec5e512d78faf0d6bd9ef171c19c000cbe6bdb4ad26daf0c81eb: Status 404 returned error can't find the container with id 57aa0f6d6792ec5e512d78faf0d6bd9ef171c19c000cbe6bdb4ad26daf0c81eb Oct 01 16:21:05 crc kubenswrapper[4764]: I1001 16:21:05.816568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa","Type":"ContainerStarted","Data":"3cc9e31bcf330a6d4c231a95be10458c2070376da08332c47179d1740e09a39f"} Oct 01 16:21:05 crc kubenswrapper[4764]: I1001 16:21:05.817144 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:05 crc kubenswrapper[4764]: I1001 16:21:05.817160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa","Type":"ContainerStarted","Data":"57aa0f6d6792ec5e512d78faf0d6bd9ef171c19c000cbe6bdb4ad26daf0c81eb"} Oct 01 16:21:05 crc kubenswrapper[4764]: I1001 16:21:05.849866 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.849842958 podStartE2EDuration="2.849842958s" 
podCreationTimestamp="2025-10-01 16:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:05.84382954 +0000 UTC m=+1128.843476415" watchObservedRunningTime="2025-10-01 16:21:05.849842958 +0000 UTC m=+1128.849489823" Oct 01 16:21:11 crc kubenswrapper[4764]: I1001 16:21:11.200812 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.314496 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.365761 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.366278 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" containerName="kube-state-metrics" containerID="cri-o://2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3" gracePeriod=30 Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.836505 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.920713 4764 generic.go:334] "Generic (PLEG): container finished" podID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" containerID="2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3" exitCode=2 Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.920754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0","Type":"ContainerDied","Data":"2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3"} Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.920779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0","Type":"ContainerDied","Data":"4765fa5e059c57728c32adc65abc37cc92c17540b3a9194f4f472c1b2519905e"} Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.920794 4764 scope.go:117] "RemoveContainer" containerID="2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.920895 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.945064 4764 scope.go:117] "RemoveContainer" containerID="2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3" Oct 01 16:21:14 crc kubenswrapper[4764]: E1001 16:21:14.945404 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3\": container with ID starting with 2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3 not found: ID does not exist" containerID="2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.945533 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3"} err="failed to get container status \"2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3\": rpc error: code = NotFound desc = could not find container \"2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3\": container with ID starting with 2b3b93ad83c9b0feb3ca66b4088ea581e3c521c205141150b25c1136ba31fed3 not found: ID does not exist" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.945941 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-sxv5x"] Oct 01 16:21:14 crc kubenswrapper[4764]: E1001 16:21:14.946321 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" containerName="kube-state-metrics" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.946337 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" containerName="kube-state-metrics" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.950320 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" containerName="kube-state-metrics" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.951001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.953293 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.953449 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 16:21:14 crc kubenswrapper[4764]: I1001 16:21:14.969011 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sxv5x"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.001910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqg7p\" (UniqueName: \"kubernetes.io/projected/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0-kube-api-access-qqg7p\") pod \"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0\" (UID: \"948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0\") " Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.029312 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0-kube-api-access-qqg7p" (OuterVolumeSpecName: "kube-api-access-qqg7p") pod "948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" (UID: "948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0"). InnerVolumeSpecName "kube-api-access-qqg7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.090032 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.091958 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.096537 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.098953 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.103449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-scripts\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.103526 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.103582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgqzr\" (UniqueName: \"kubernetes.io/projected/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-kube-api-access-hgqzr\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.103638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-config-data\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 
01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.103685 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqg7p\" (UniqueName: \"kubernetes.io/projected/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0-kube-api-access-qqg7p\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.171479 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.172495 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.174646 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.189807 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.204811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgqzr\" (UniqueName: \"kubernetes.io/projected/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-kube-api-access-hgqzr\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.204943 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-config-data\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.204986 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.205033 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-scripts\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.205087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmh9t\" (UniqueName: \"kubernetes.io/projected/29147115-0a7e-431e-9c9f-609ac6547ae7-kube-api-access-nmh9t\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.205178 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-config-data\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.205849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.206062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29147115-0a7e-431e-9c9f-609ac6547ae7-logs\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc 
kubenswrapper[4764]: I1001 16:21:15.218164 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-config-data\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.219758 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-scripts\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.221449 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.264411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgqzr\" (UniqueName: \"kubernetes.io/projected/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-kube-api-access-hgqzr\") pod \"nova-cell0-cell-mapping-sxv5x\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.266465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.290906 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.299151 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.302270 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.305592 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307768 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqhqd\" (UniqueName: \"kubernetes.io/projected/d1e50ddd-40e7-470a-8454-aab5337c9469-kube-api-access-wqhqd\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29147115-0a7e-431e-9c9f-609ac6547ae7-logs\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-config-data\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-config-data\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf125f-bb40-40a8-911c-520f9ce115f0-logs\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.307995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljbtf\" (UniqueName: \"kubernetes.io/projected/83bf125f-bb40-40a8-911c-520f9ce115f0-kube-api-access-ljbtf\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.308029 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmh9t\" (UniqueName: \"kubernetes.io/projected/29147115-0a7e-431e-9c9f-609ac6547ae7-kube-api-access-nmh9t\") pod \"nova-api-0\" (UID: 
\"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.308115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.308141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-config-data\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.308923 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.312500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29147115-0a7e-431e-9c9f-609ac6547ae7-logs\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.313815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.329277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-config-data\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc 
kubenswrapper[4764]: I1001 16:21:15.340622 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.344796 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.345867 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.355325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmh9t\" (UniqueName: \"kubernetes.io/projected/29147115-0a7e-431e-9c9f-609ac6547ae7-kube-api-access-nmh9t\") pod \"nova-api-0\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.360239 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.360425 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-config-data\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf125f-bb40-40a8-911c-520f9ce115f0-logs\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-ljbtf\" (UniqueName: \"kubernetes.io/projected/83bf125f-bb40-40a8-911c-520f9ce115f0-kube-api-access-ljbtf\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqhqd\" (UniqueName: \"kubernetes.io/projected/d1e50ddd-40e7-470a-8454-aab5337c9469-kube-api-access-wqhqd\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410561 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-config-data\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410642 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8tm\" (UniqueName: \"kubernetes.io/projected/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-api-access-pl8tm\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.410740 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf125f-bb40-40a8-911c-520f9ce115f0-logs\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.411079 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.427372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.434170 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljbtf\" (UniqueName: \"kubernetes.io/projected/83bf125f-bb40-40a8-911c-520f9ce115f0-kube-api-access-ljbtf\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.443889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqhqd\" (UniqueName: \"kubernetes.io/projected/d1e50ddd-40e7-470a-8454-aab5337c9469-kube-api-access-wqhqd\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.467551 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-config-data\") pod \"nova-metadata-0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.467691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.468830 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-config-data\") pod \"nova-scheduler-0\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.505456 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.507243 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.513507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8tm\" (UniqueName: \"kubernetes.io/projected/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-api-access-pl8tm\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.513590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.513636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.513651 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-state-metrics-tls-config\") pod 
\"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.527402 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.528729 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.535470 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.544931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.548504 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.553909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.557645 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8tm\" (UniqueName: 
\"kubernetes.io/projected/a1a83996-de2f-4abe-a075-8c0c2191eb7b-kube-api-access-pl8tm\") pod \"kube-state-metrics-0\" (UID: \"a1a83996-de2f-4abe-a075-8c0c2191eb7b\") " pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.560417 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.598084 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-zmqd5"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.614383 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.633551 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.641106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-zmqd5"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.691387 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.720883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-config\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721350 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8bmm\" (UniqueName: \"kubernetes.io/projected/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-kube-api-access-f8bmm\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-dns-svc\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721506 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdfgl\" (UniqueName: \"kubernetes.io/projected/75884291-058a-479e-9c6f-9880d64900fe-kube-api-access-mdfgl\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.721527 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.785000 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" path="/var/lib/kubelet/pods/948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0/volumes" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823328 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-dns-svc\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823384 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdfgl\" (UniqueName: \"kubernetes.io/projected/75884291-058a-479e-9c6f-9880d64900fe-kube-api-access-mdfgl\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823418 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823486 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-config\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.823605 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8bmm\" (UniqueName: \"kubernetes.io/projected/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-kube-api-access-f8bmm\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.825898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.826329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-dns-svc\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.826583 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.827090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-config\") 
pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.831565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.841607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8bmm\" (UniqueName: \"kubernetes.io/projected/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-kube-api-access-f8bmm\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.844582 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdfgl\" (UniqueName: \"kubernetes.io/projected/75884291-058a-479e-9c6f-9880d64900fe-kube-api-access-mdfgl\") pod \"dnsmasq-dns-566b5b7845-zmqd5\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.846794 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.880590 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sxv5x"] Oct 01 16:21:15 crc kubenswrapper[4764]: I1001 16:21:15.938704 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sxv5x" 
event={"ID":"1b99c5e9-3966-41d0-af69-2d0eb7a86d25","Type":"ContainerStarted","Data":"80c6d8c2937ce2b8131cc5d994bead023ddaeb71aac16a87a30a4261f3c62643"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.014747 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.015746 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.024392 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.031804 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r2sz8"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.033166 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.038275 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.038832 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.042902 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r2sz8"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.134306 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc 
kubenswrapper[4764]: I1001 16:21:16.134350 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvtz\" (UniqueName: \"kubernetes.io/projected/c6bfb577-6442-4db0-b962-a89441eb7a9c-kube-api-access-cvvtz\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.134516 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-scripts\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.134547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-config-data\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.149477 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.149782 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-central-agent" containerID="cri-o://58e9d540739bc1d248198600d53c93fe3ad611c000c70a55d8dff8637923c4e0" gracePeriod=30 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.149847 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="sg-core" 
containerID="cri-o://03d3cc5a13f44af7e7ac35d0e927c95fd1668571e56adf6d52ffd6a49939ff37" gracePeriod=30 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.149891 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-notification-agent" containerID="cri-o://b5aefc8d50296d655a2870014ef6aa207b8edda0b6696089503036bb8f852194" gracePeriod=30 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.150026 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="proxy-httpd" containerID="cri-o://d7989c3875da0c28d8556c4cebc844efab4a1ac5324f5f77e0314f8d7ba22760" gracePeriod=30 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.179347 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.236468 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-scripts\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.236507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-config-data\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.236532 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.236549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvtz\" (UniqueName: \"kubernetes.io/projected/c6bfb577-6442-4db0-b962-a89441eb7a9c-kube-api-access-cvvtz\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.241290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-scripts\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.241701 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.243636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-config-data\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.244672 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.255009 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cvvtz\" (UniqueName: \"kubernetes.io/projected/c6bfb577-6442-4db0-b962-a89441eb7a9c-kube-api-access-cvvtz\") pod \"nova-cell1-conductor-db-sync-r2sz8\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.275875 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.359283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.559591 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:16 crc kubenswrapper[4764]: W1001 16:21:16.585880 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8eb9e30_0cde_4a8a_86fa_9201a5efe701.slice/crio-97dfa3be2f8ece6cd4f347c3cc46d148740c40c192aa83d83451a9eb0aae88a1 WatchSource:0}: Error finding container 97dfa3be2f8ece6cd4f347c3cc46d148740c40c192aa83d83451a9eb0aae88a1: Status 404 returned error can't find the container with id 97dfa3be2f8ece6cd4f347c3cc46d148740c40c192aa83d83451a9eb0aae88a1 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.615585 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-zmqd5"] Oct 01 16:21:16 crc kubenswrapper[4764]: W1001 16:21:16.626615 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75884291_058a_479e_9c6f_9880d64900fe.slice/crio-476a656e1d674883d9ddbb8552292d2d747be09387f39173fa1ce7af97350a4a WatchSource:0}: Error finding container 476a656e1d674883d9ddbb8552292d2d747be09387f39173fa1ce7af97350a4a: Status 404 returned error can't find the container with id 
476a656e1d674883d9ddbb8552292d2d747be09387f39173fa1ce7af97350a4a Oct 01 16:21:16 crc kubenswrapper[4764]: W1001 16:21:16.832883 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6bfb577_6442_4db0_b962_a89441eb7a9c.slice/crio-23e3e95e00f41a25f706a48614a24cc01267c2a0ae032b3c04014b2c045ddf9d WatchSource:0}: Error finding container 23e3e95e00f41a25f706a48614a24cc01267c2a0ae032b3c04014b2c045ddf9d: Status 404 returned error can't find the container with id 23e3e95e00f41a25f706a48614a24cc01267c2a0ae032b3c04014b2c045ddf9d Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.834683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r2sz8"] Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.949356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83bf125f-bb40-40a8-911c-520f9ce115f0","Type":"ContainerStarted","Data":"0c3918192c81639ac5e1255519b8389c8e2a6d85458d0b01efbbc29266f4be15"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.952637 4764 generic.go:334] "Generic (PLEG): container finished" podID="75884291-058a-479e-9c6f-9880d64900fe" containerID="b11f1651486853ec9e9cd33d78cb0721eb7ac54b38c8b8f9380a2a3fd6873276" exitCode=0 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.952687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" event={"ID":"75884291-058a-479e-9c6f-9880d64900fe","Type":"ContainerDied","Data":"b11f1651486853ec9e9cd33d78cb0721eb7ac54b38c8b8f9380a2a3fd6873276"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.952707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" event={"ID":"75884291-058a-479e-9c6f-9880d64900fe","Type":"ContainerStarted","Data":"476a656e1d674883d9ddbb8552292d2d747be09387f39173fa1ce7af97350a4a"} Oct 01 16:21:16 crc 
kubenswrapper[4764]: I1001 16:21:16.960296 4764 generic.go:334] "Generic (PLEG): container finished" podID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerID="d7989c3875da0c28d8556c4cebc844efab4a1ac5324f5f77e0314f8d7ba22760" exitCode=0 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.960326 4764 generic.go:334] "Generic (PLEG): container finished" podID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerID="03d3cc5a13f44af7e7ac35d0e927c95fd1668571e56adf6d52ffd6a49939ff37" exitCode=2 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.960337 4764 generic.go:334] "Generic (PLEG): container finished" podID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerID="58e9d540739bc1d248198600d53c93fe3ad611c000c70a55d8dff8637923c4e0" exitCode=0 Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.960375 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerDied","Data":"d7989c3875da0c28d8556c4cebc844efab4a1ac5324f5f77e0314f8d7ba22760"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.960400 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerDied","Data":"03d3cc5a13f44af7e7ac35d0e927c95fd1668571e56adf6d52ffd6a49939ff37"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.960412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerDied","Data":"58e9d540739bc1d248198600d53c93fe3ad611c000c70a55d8dff8637923c4e0"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.961848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29147115-0a7e-431e-9c9f-609ac6547ae7","Type":"ContainerStarted","Data":"a555e614f897fb5a673d835b34248541845c1f609776bf08f25181ad87d87858"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.962826 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1e50ddd-40e7-470a-8454-aab5337c9469","Type":"ContainerStarted","Data":"77a1a20e88a0615de7897a18580d54a4b97a0938915fe0f2857c47402d41e6c0"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.964608 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sxv5x" event={"ID":"1b99c5e9-3966-41d0-af69-2d0eb7a86d25","Type":"ContainerStarted","Data":"1a80f7f6f8927cfdb5534a5d0fc06b5b6fbfdd26e2766c8df9b54722bcec1a8d"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.974584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" event={"ID":"c6bfb577-6442-4db0-b962-a89441eb7a9c","Type":"ContainerStarted","Data":"23e3e95e00f41a25f706a48614a24cc01267c2a0ae032b3c04014b2c045ddf9d"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.978755 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a1a83996-de2f-4abe-a075-8c0c2191eb7b","Type":"ContainerStarted","Data":"f1b1cf4ae37226bb3a6336a8e854a83ac0e3c78c6fbc859c36ec583c8b08b43b"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.978799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a1a83996-de2f-4abe-a075-8c0c2191eb7b","Type":"ContainerStarted","Data":"fee46ec8b0590ec611f404eea5389750d3fa5bb1fe9da5240208a845e382d45f"} Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.979028 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 16:21:16 crc kubenswrapper[4764]: I1001 16:21:16.979971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b8eb9e30-0cde-4a8a-86fa-9201a5efe701","Type":"ContainerStarted","Data":"97dfa3be2f8ece6cd4f347c3cc46d148740c40c192aa83d83451a9eb0aae88a1"} Oct 01 16:21:17 crc 
kubenswrapper[4764]: I1001 16:21:17.016335 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-sxv5x" podStartSLOduration=3.016313626 podStartE2EDuration="3.016313626s" podCreationTimestamp="2025-10-01 16:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:16.984332488 +0000 UTC m=+1139.983979323" watchObservedRunningTime="2025-10-01 16:21:17.016313626 +0000 UTC m=+1140.015960461" Oct 01 16:21:17 crc kubenswrapper[4764]: I1001 16:21:17.023429 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.6218791449999999 podStartE2EDuration="2.023413011s" podCreationTimestamp="2025-10-01 16:21:15 +0000 UTC" firstStartedPulling="2025-10-01 16:21:16.259370998 +0000 UTC m=+1139.259017833" lastFinishedPulling="2025-10-01 16:21:16.660904864 +0000 UTC m=+1139.660551699" observedRunningTime="2025-10-01 16:21:17.002004884 +0000 UTC m=+1140.001651709" watchObservedRunningTime="2025-10-01 16:21:17.023413011 +0000 UTC m=+1140.023059846" Oct 01 16:21:17 crc kubenswrapper[4764]: I1001 16:21:17.994700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" event={"ID":"c6bfb577-6442-4db0-b962-a89441eb7a9c","Type":"ContainerStarted","Data":"e164c82ca96c5f7cedee0ab66c69a19c4a1fe02bcb5a45e9eb658c45143c73f3"} Oct 01 16:21:18 crc kubenswrapper[4764]: I1001 16:21:18.010731 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" event={"ID":"75884291-058a-479e-9c6f-9880d64900fe","Type":"ContainerStarted","Data":"45aa2458a483632cefb43d43e7a67e090d8da9c60c74ff2d5fa6f1185dd8b6be"} Oct 01 16:21:18 crc kubenswrapper[4764]: I1001 16:21:18.018969 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" 
podStartSLOduration=2.018946254 podStartE2EDuration="2.018946254s" podCreationTimestamp="2025-10-01 16:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:18.015456749 +0000 UTC m=+1141.015103584" watchObservedRunningTime="2025-10-01 16:21:18.018946254 +0000 UTC m=+1141.018593089" Oct 01 16:21:18 crc kubenswrapper[4764]: I1001 16:21:18.032976 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" podStartSLOduration=3.032951629 podStartE2EDuration="3.032951629s" podCreationTimestamp="2025-10-01 16:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:18.031399371 +0000 UTC m=+1141.031046206" watchObservedRunningTime="2025-10-01 16:21:18.032951629 +0000 UTC m=+1141.032598454" Oct 01 16:21:18 crc kubenswrapper[4764]: I1001 16:21:18.720078 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:18 crc kubenswrapper[4764]: I1001 16:21:18.729699 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:19 crc kubenswrapper[4764]: I1001 16:21:19.029900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29147115-0a7e-431e-9c9f-609ac6547ae7","Type":"ContainerStarted","Data":"5e2855d6c89e7d34ad2846a73c8a485eb15b8dba6a1d8db70fe39daca0369ab0"} Oct 01 16:21:19 crc kubenswrapper[4764]: I1001 16:21:19.030910 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:19 crc kubenswrapper[4764]: I1001 16:21:19.704297 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="948ddca7-2639-4b9f-8ff5-fe5f8f4f49e0" containerName="kube-state-metrics" 
probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.038767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29147115-0a7e-431e-9c9f-609ac6547ae7","Type":"ContainerStarted","Data":"03cdb772aa1c9e95835840fb1b5823a012e11cb175b6bd8f4750ce44e48aa7fa"} Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.040439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b8eb9e30-0cde-4a8a-86fa-9201a5efe701","Type":"ContainerStarted","Data":"f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981"} Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.040561 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b8eb9e30-0cde-4a8a-86fa-9201a5efe701" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981" gracePeriod=30 Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.042783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83bf125f-bb40-40a8-911c-520f9ce115f0","Type":"ContainerStarted","Data":"969650fbd38926bafbff3eb4589cdb2b3989f17bd41714c25fb6cf035175e154"} Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.042842 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-log" containerID="cri-o://c1cd30d23d06154d4010ae1182be8357d00a902ab2173434b4d3298dd59accf4" gracePeriod=30 Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.042832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"83bf125f-bb40-40a8-911c-520f9ce115f0","Type":"ContainerStarted","Data":"c1cd30d23d06154d4010ae1182be8357d00a902ab2173434b4d3298dd59accf4"} Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.042851 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-metadata" containerID="cri-o://969650fbd38926bafbff3eb4589cdb2b3989f17bd41714c25fb6cf035175e154" gracePeriod=30 Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.047627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1e50ddd-40e7-470a-8454-aab5337c9469","Type":"ContainerStarted","Data":"e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98"} Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.068965 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.621859917 podStartE2EDuration="5.068948704s" podCreationTimestamp="2025-10-01 16:21:15 +0000 UTC" firstStartedPulling="2025-10-01 16:21:16.037850562 +0000 UTC m=+1139.037497397" lastFinishedPulling="2025-10-01 16:21:18.484939349 +0000 UTC m=+1141.484586184" observedRunningTime="2025-10-01 16:21:20.062470164 +0000 UTC m=+1143.062117009" watchObservedRunningTime="2025-10-01 16:21:20.068948704 +0000 UTC m=+1143.068595539" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.083522 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.84086049 podStartE2EDuration="5.083509492s" podCreationTimestamp="2025-10-01 16:21:15 +0000 UTC" firstStartedPulling="2025-10-01 16:21:16.247202338 +0000 UTC m=+1139.246849163" lastFinishedPulling="2025-10-01 16:21:18.48985133 +0000 UTC m=+1141.489498165" observedRunningTime="2025-10-01 16:21:20.079184856 +0000 UTC m=+1143.078831701" watchObservedRunningTime="2025-10-01 
16:21:20.083509492 +0000 UTC m=+1143.083156327" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.096471 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.224064041 podStartE2EDuration="5.09645655s" podCreationTimestamp="2025-10-01 16:21:15 +0000 UTC" firstStartedPulling="2025-10-01 16:21:16.599164634 +0000 UTC m=+1139.598811459" lastFinishedPulling="2025-10-01 16:21:19.471557133 +0000 UTC m=+1142.471203968" observedRunningTime="2025-10-01 16:21:20.093328344 +0000 UTC m=+1143.092975179" watchObservedRunningTime="2025-10-01 16:21:20.09645655 +0000 UTC m=+1143.096103375" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.115949 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.834057671 podStartE2EDuration="5.11593334s" podCreationTimestamp="2025-10-01 16:21:15 +0000 UTC" firstStartedPulling="2025-10-01 16:21:16.2050566 +0000 UTC m=+1139.204703435" lastFinishedPulling="2025-10-01 16:21:18.486932269 +0000 UTC m=+1141.486579104" observedRunningTime="2025-10-01 16:21:20.108534598 +0000 UTC m=+1143.108181423" watchObservedRunningTime="2025-10-01 16:21:20.11593334 +0000 UTC m=+1143.115580175" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.508282 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.640602 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 16:21:20 crc kubenswrapper[4764]: I1001 16:21:20.640666 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.016234 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 
16:21:21.074739 4764 generic.go:334] "Generic (PLEG): container finished" podID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerID="b5aefc8d50296d655a2870014ef6aa207b8edda0b6696089503036bb8f852194" exitCode=0 Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.074819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerDied","Data":"b5aefc8d50296d655a2870014ef6aa207b8edda0b6696089503036bb8f852194"} Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.077232 4764 generic.go:334] "Generic (PLEG): container finished" podID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerID="969650fbd38926bafbff3eb4589cdb2b3989f17bd41714c25fb6cf035175e154" exitCode=0 Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.077259 4764 generic.go:334] "Generic (PLEG): container finished" podID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerID="c1cd30d23d06154d4010ae1182be8357d00a902ab2173434b4d3298dd59accf4" exitCode=143 Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.079874 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83bf125f-bb40-40a8-911c-520f9ce115f0","Type":"ContainerDied","Data":"969650fbd38926bafbff3eb4589cdb2b3989f17bd41714c25fb6cf035175e154"} Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.081201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83bf125f-bb40-40a8-911c-520f9ce115f0","Type":"ContainerDied","Data":"c1cd30d23d06154d4010ae1182be8357d00a902ab2173434b4d3298dd59accf4"} Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.132371 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.244927 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf125f-bb40-40a8-911c-520f9ce115f0-logs\") pod \"83bf125f-bb40-40a8-911c-520f9ce115f0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.245058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-combined-ca-bundle\") pod \"83bf125f-bb40-40a8-911c-520f9ce115f0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.245108 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljbtf\" (UniqueName: \"kubernetes.io/projected/83bf125f-bb40-40a8-911c-520f9ce115f0-kube-api-access-ljbtf\") pod \"83bf125f-bb40-40a8-911c-520f9ce115f0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.245309 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-config-data\") pod \"83bf125f-bb40-40a8-911c-520f9ce115f0\" (UID: \"83bf125f-bb40-40a8-911c-520f9ce115f0\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.245589 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bf125f-bb40-40a8-911c-520f9ce115f0-logs" (OuterVolumeSpecName: "logs") pod "83bf125f-bb40-40a8-911c-520f9ce115f0" (UID: "83bf125f-bb40-40a8-911c-520f9ce115f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.245854 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf125f-bb40-40a8-911c-520f9ce115f0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.269182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83bf125f-bb40-40a8-911c-520f9ce115f0-kube-api-access-ljbtf" (OuterVolumeSpecName: "kube-api-access-ljbtf") pod "83bf125f-bb40-40a8-911c-520f9ce115f0" (UID: "83bf125f-bb40-40a8-911c-520f9ce115f0"). InnerVolumeSpecName "kube-api-access-ljbtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.281213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83bf125f-bb40-40a8-911c-520f9ce115f0" (UID: "83bf125f-bb40-40a8-911c-520f9ce115f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.282946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-config-data" (OuterVolumeSpecName: "config-data") pod "83bf125f-bb40-40a8-911c-520f9ce115f0" (UID: "83bf125f-bb40-40a8-911c-520f9ce115f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.347220 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.347247 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf125f-bb40-40a8-911c-520f9ce115f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.347258 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljbtf\" (UniqueName: \"kubernetes.io/projected/83bf125f-bb40-40a8-911c-520f9ce115f0-kube-api-access-ljbtf\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.347962 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.448748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-run-httpd\") pod \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.448842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkt5j\" (UniqueName: \"kubernetes.io/projected/9605b1c9-ed82-4e24-b150-f966fd09d0f2-kube-api-access-hkt5j\") pod \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.448896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-scripts\") pod 
\"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.448925 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-log-httpd\") pod \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.448964 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-sg-core-conf-yaml\") pod \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.449019 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-config-data\") pod \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.449121 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.449787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.449896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-combined-ca-bundle\") pod \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\" (UID: \"9605b1c9-ed82-4e24-b150-f966fd09d0f2\") " Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.450684 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.450696 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9605b1c9-ed82-4e24-b150-f966fd09d0f2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.454664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9605b1c9-ed82-4e24-b150-f966fd09d0f2-kube-api-access-hkt5j" (OuterVolumeSpecName: "kube-api-access-hkt5j") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "kube-api-access-hkt5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.457212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-scripts" (OuterVolumeSpecName: "scripts") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.473769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.539146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.554120 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkt5j\" (UniqueName: \"kubernetes.io/projected/9605b1c9-ed82-4e24-b150-f966fd09d0f2-kube-api-access-hkt5j\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.554147 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.554157 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.554164 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.561361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-config-data" (OuterVolumeSpecName: "config-data") pod "9605b1c9-ed82-4e24-b150-f966fd09d0f2" (UID: "9605b1c9-ed82-4e24-b150-f966fd09d0f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:21 crc kubenswrapper[4764]: I1001 16:21:21.655807 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9605b1c9-ed82-4e24-b150-f966fd09d0f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.093152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"83bf125f-bb40-40a8-911c-520f9ce115f0","Type":"ContainerDied","Data":"0c3918192c81639ac5e1255519b8389c8e2a6d85458d0b01efbbc29266f4be15"} Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.093298 4764 scope.go:117] "RemoveContainer" containerID="969650fbd38926bafbff3eb4589cdb2b3989f17bd41714c25fb6cf035175e154" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.096435 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.098889 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.098950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9605b1c9-ed82-4e24-b150-f966fd09d0f2","Type":"ContainerDied","Data":"848cb876ceb49c147edb0ac3a1c2c3f694c01caf1df581b2b28eb861e68ef40a"} Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.125732 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.136330 4764 scope.go:117] "RemoveContainer" containerID="c1cd30d23d06154d4010ae1182be8357d00a902ab2173434b4d3298dd59accf4" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.139121 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.151212 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.163337 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171385 4764 scope.go:117] "RemoveContainer" containerID="d7989c3875da0c28d8556c4cebc844efab4a1ac5324f5f77e0314f8d7ba22760" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171508 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: E1001 16:21:22.171874 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-central-agent" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171886 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-central-agent" Oct 01 16:21:22 crc kubenswrapper[4764]: E1001 16:21:22.171898 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="proxy-httpd" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171904 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="proxy-httpd" Oct 01 16:21:22 crc kubenswrapper[4764]: E1001 16:21:22.171915 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-metadata" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171921 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-metadata" Oct 01 16:21:22 crc kubenswrapper[4764]: E1001 16:21:22.171932 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-notification-agent" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171938 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-notification-agent" Oct 01 16:21:22 crc kubenswrapper[4764]: E1001 16:21:22.171951 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-log" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171957 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-log" Oct 01 16:21:22 crc kubenswrapper[4764]: E1001 16:21:22.171968 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="sg-core" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.171973 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="sg-core" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.172150 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="sg-core" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.172160 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-notification-agent" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.172172 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="ceilometer-central-agent" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.172187 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-log" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.172196 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" containerName="proxy-httpd" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.172204 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" containerName="nova-metadata-metadata" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.173140 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.175260 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.175779 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.177367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.241249 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.245227 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.256434 4764 scope.go:117] "RemoveContainer" containerID="03d3cc5a13f44af7e7ac35d0e927c95fd1668571e56adf6d52ffd6a49939ff37" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.258557 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.258780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.258939 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.266153 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.269195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-config-data\") pod \"nova-metadata-0\" (UID: 
\"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.269408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-logs\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.269657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.269808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.269956 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkjt\" (UniqueName: \"kubernetes.io/projected/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-kube-api-access-5bkjt\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.289947 4764 scope.go:117] "RemoveContainer" containerID="b5aefc8d50296d655a2870014ef6aa207b8edda0b6696089503036bb8f852194" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.307359 4764 scope.go:117] "RemoveContainer" containerID="58e9d540739bc1d248198600d53c93fe3ad611c000c70a55d8dff8637923c4e0" Oct 01 16:21:22 crc 
kubenswrapper[4764]: I1001 16:21:22.371967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs2kv\" (UniqueName: \"kubernetes.io/projected/c8729af9-a340-4aaf-8954-eb497e3c8d3d-kube-api-access-cs2kv\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-log-httpd\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372450 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372498 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-config-data\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372580 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkjt\" (UniqueName: \"kubernetes.io/projected/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-kube-api-access-5bkjt\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-config-data\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-logs\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-run-httpd\") pod 
\"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.372964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-scripts\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.373384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-logs\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.377271 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.379419 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.390926 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkjt\" (UniqueName: \"kubernetes.io/projected/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-kube-api-access-5bkjt\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.392601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-config-data\") pod \"nova-metadata-0\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.474860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-log-httpd\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.474919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.474961 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-config-data\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475089 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-scripts\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs2kv\" (UniqueName: \"kubernetes.io/projected/c8729af9-a340-4aaf-8954-eb497e3c8d3d-kube-api-access-cs2kv\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-log-httpd\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.475799 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-run-httpd\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.479468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.479601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.479935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-scripts\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.480722 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-config-data\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.481032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 
16:21:22.500877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs2kv\" (UniqueName: \"kubernetes.io/projected/c8729af9-a340-4aaf-8954-eb497e3c8d3d-kube-api-access-cs2kv\") pod \"ceilometer-0\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " pod="openstack/ceilometer-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.541178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:22 crc kubenswrapper[4764]: I1001 16:21:22.583677 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:21:23 crc kubenswrapper[4764]: I1001 16:21:23.032261 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:23 crc kubenswrapper[4764]: I1001 16:21:23.041170 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:23 crc kubenswrapper[4764]: W1001 16:21:23.051725 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8729af9_a340_4aaf_8954_eb497e3c8d3d.slice/crio-1a6907d7d1599d4a0d684bd0e91f03f80c01205e68a562ccc807e433a1cf5916 WatchSource:0}: Error finding container 1a6907d7d1599d4a0d684bd0e91f03f80c01205e68a562ccc807e433a1cf5916: Status 404 returned error can't find the container with id 1a6907d7d1599d4a0d684bd0e91f03f80c01205e68a562ccc807e433a1cf5916 Oct 01 16:21:23 crc kubenswrapper[4764]: I1001 16:21:23.111935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0aa8c29-c369-4a6d-a59c-ab2bd3175819","Type":"ContainerStarted","Data":"f70aea32467274064e64583e4ce396f05746549326096ecdac97747413d43eca"} Oct 01 16:21:23 crc kubenswrapper[4764]: I1001 16:21:23.114865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerStarted","Data":"1a6907d7d1599d4a0d684bd0e91f03f80c01205e68a562ccc807e433a1cf5916"} Oct 01 16:21:23 crc kubenswrapper[4764]: I1001 16:21:23.735939 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83bf125f-bb40-40a8-911c-520f9ce115f0" path="/var/lib/kubelet/pods/83bf125f-bb40-40a8-911c-520f9ce115f0/volumes" Oct 01 16:21:23 crc kubenswrapper[4764]: I1001 16:21:23.737789 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9605b1c9-ed82-4e24-b150-f966fd09d0f2" path="/var/lib/kubelet/pods/9605b1c9-ed82-4e24-b150-f966fd09d0f2/volumes" Oct 01 16:21:24 crc kubenswrapper[4764]: I1001 16:21:24.141597 4764 generic.go:334] "Generic (PLEG): container finished" podID="1b99c5e9-3966-41d0-af69-2d0eb7a86d25" containerID="1a80f7f6f8927cfdb5534a5d0fc06b5b6fbfdd26e2766c8df9b54722bcec1a8d" exitCode=0 Oct 01 16:21:24 crc kubenswrapper[4764]: I1001 16:21:24.141735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sxv5x" event={"ID":"1b99c5e9-3966-41d0-af69-2d0eb7a86d25","Type":"ContainerDied","Data":"1a80f7f6f8927cfdb5534a5d0fc06b5b6fbfdd26e2766c8df9b54722bcec1a8d"} Oct 01 16:21:24 crc kubenswrapper[4764]: I1001 16:21:24.156894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0aa8c29-c369-4a6d-a59c-ab2bd3175819","Type":"ContainerStarted","Data":"b4c12b63e1ee8008fa5648fa6cbbe020eda19b95f4a985767e3c949da3e9e973"} Oct 01 16:21:24 crc kubenswrapper[4764]: I1001 16:21:24.156972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0aa8c29-c369-4a6d-a59c-ab2bd3175819","Type":"ContainerStarted","Data":"2aabedb919decbef54e40c4f1906f9b2f6aabf0ccffe729437036193121cf58d"} Oct 01 16:21:24 crc kubenswrapper[4764]: I1001 16:21:24.162631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerStarted","Data":"392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b"} Oct 01 16:21:24 crc kubenswrapper[4764]: I1001 16:21:24.192805 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.192789518 podStartE2EDuration="2.192789518s" podCreationTimestamp="2025-10-01 16:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:24.187709042 +0000 UTC m=+1147.187355877" watchObservedRunningTime="2025-10-01 16:21:24.192789518 +0000 UTC m=+1147.192436353" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.176700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerStarted","Data":"560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510"} Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.413261 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.413733 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.508858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.537322 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.596112 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.718004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.737743 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-combined-ca-bundle\") pod \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.737864 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-config-data\") pod \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.738014 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgqzr\" (UniqueName: \"kubernetes.io/projected/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-kube-api-access-hgqzr\") pod \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.738765 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-scripts\") pod \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\" (UID: \"1b99c5e9-3966-41d0-af69-2d0eb7a86d25\") " Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.744820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-kube-api-access-hgqzr" (OuterVolumeSpecName: "kube-api-access-hgqzr") pod "1b99c5e9-3966-41d0-af69-2d0eb7a86d25" (UID: 
"1b99c5e9-3966-41d0-af69-2d0eb7a86d25"). InnerVolumeSpecName "kube-api-access-hgqzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.746727 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-scripts" (OuterVolumeSpecName: "scripts") pod "1b99c5e9-3966-41d0-af69-2d0eb7a86d25" (UID: "1b99c5e9-3966-41d0-af69-2d0eb7a86d25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.784288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b99c5e9-3966-41d0-af69-2d0eb7a86d25" (UID: "1b99c5e9-3966-41d0-af69-2d0eb7a86d25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.790330 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-config-data" (OuterVolumeSpecName: "config-data") pod "1b99c5e9-3966-41d0-af69-2d0eb7a86d25" (UID: "1b99c5e9-3966-41d0-af69-2d0eb7a86d25"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.841390 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgqzr\" (UniqueName: \"kubernetes.io/projected/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-kube-api-access-hgqzr\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.841424 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.841434 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:25 crc kubenswrapper[4764]: I1001 16:21:25.841446 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b99c5e9-3966-41d0-af69-2d0eb7a86d25-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.027619 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.117941 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"] Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.118213 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerName="dnsmasq-dns" containerID="cri-o://09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e" gracePeriod=10 Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.212716 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerStarted","Data":"1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc"} Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.218611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sxv5x" event={"ID":"1b99c5e9-3966-41d0-af69-2d0eb7a86d25","Type":"ContainerDied","Data":"80c6d8c2937ce2b8131cc5d994bead023ddaeb71aac16a87a30a4261f3c62643"} Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.218657 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80c6d8c2937ce2b8131cc5d994bead023ddaeb71aac16a87a30a4261f3c62643" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.218762 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sxv5x" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.229313 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6bfb577-6442-4db0-b962-a89441eb7a9c" containerID="e164c82ca96c5f7cedee0ab66c69a19c4a1fe02bcb5a45e9eb658c45143c73f3" exitCode=0 Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.230190 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" event={"ID":"c6bfb577-6442-4db0-b962-a89441eb7a9c","Type":"ContainerDied","Data":"e164c82ca96c5f7cedee0ab66c69a19c4a1fe02bcb5a45e9eb658c45143c73f3"} Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.368937 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.390405 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.390616 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" 
containerName="nova-api-log" containerID="cri-o://5e2855d6c89e7d34ad2846a73c8a485eb15b8dba6a1d8db70fe39daca0369ab0" gracePeriod=30 Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.390745 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-api" containerID="cri-o://03cdb772aa1c9e95835840fb1b5823a012e11cb175b6bd8f4750ce44e48aa7fa" gracePeriod=30 Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.415341 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.419124 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": EOF (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.419361 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.169:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.548074 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.548279 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-log" containerID="cri-o://2aabedb919decbef54e40c4f1906f9b2f6aabf0ccffe729437036193121cf58d" gracePeriod=30 Oct 01 16:21:26 crc kubenswrapper[4764]: I1001 16:21:26.548411 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-metadata" containerID="cri-o://b4c12b63e1ee8008fa5648fa6cbbe020eda19b95f4a985767e3c949da3e9e973" gracePeriod=30 Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.181215 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.255494 4764 generic.go:334] "Generic (PLEG): container finished" podID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerID="b4c12b63e1ee8008fa5648fa6cbbe020eda19b95f4a985767e3c949da3e9e973" exitCode=0 Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.255549 4764 generic.go:334] "Generic (PLEG): container finished" podID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerID="2aabedb919decbef54e40c4f1906f9b2f6aabf0ccffe729437036193121cf58d" exitCode=143 Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.255627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0aa8c29-c369-4a6d-a59c-ab2bd3175819","Type":"ContainerDied","Data":"b4c12b63e1ee8008fa5648fa6cbbe020eda19b95f4a985767e3c949da3e9e973"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.255660 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0aa8c29-c369-4a6d-a59c-ab2bd3175819","Type":"ContainerDied","Data":"2aabedb919decbef54e40c4f1906f9b2f6aabf0ccffe729437036193121cf58d"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.255675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a0aa8c29-c369-4a6d-a59c-ab2bd3175819","Type":"ContainerDied","Data":"f70aea32467274064e64583e4ce396f05746549326096ecdac97747413d43eca"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.255704 4764 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f70aea32467274064e64583e4ce396f05746549326096ecdac97747413d43eca" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.277987 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerStarted","Data":"a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.278430 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw89j\" (UniqueName: \"kubernetes.io/projected/79e70d5f-7ec6-4c79-9044-b9495bf01054-kube-api-access-tw89j\") pod \"79e70d5f-7ec6-4c79-9044-b9495bf01054\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.278548 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-nb\") pod \"79e70d5f-7ec6-4c79-9044-b9495bf01054\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.278559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.278636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-config\") pod \"79e70d5f-7ec6-4c79-9044-b9495bf01054\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.278657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-sb\") pod \"79e70d5f-7ec6-4c79-9044-b9495bf01054\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 
16:21:27.278720 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-dns-svc\") pod \"79e70d5f-7ec6-4c79-9044-b9495bf01054\" (UID: \"79e70d5f-7ec6-4c79-9044-b9495bf01054\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.290048 4764 generic.go:334] "Generic (PLEG): container finished" podID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerID="5e2855d6c89e7d34ad2846a73c8a485eb15b8dba6a1d8db70fe39daca0369ab0" exitCode=143 Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.290131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29147115-0a7e-431e-9c9f-609ac6547ae7","Type":"ContainerDied","Data":"5e2855d6c89e7d34ad2846a73c8a485eb15b8dba6a1d8db70fe39daca0369ab0"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.290989 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e70d5f-7ec6-4c79-9044-b9495bf01054-kube-api-access-tw89j" (OuterVolumeSpecName: "kube-api-access-tw89j") pod "79e70d5f-7ec6-4c79-9044-b9495bf01054" (UID: "79e70d5f-7ec6-4c79-9044-b9495bf01054"). InnerVolumeSpecName "kube-api-access-tw89j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.292527 4764 generic.go:334] "Generic (PLEG): container finished" podID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerID="09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e" exitCode=0 Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.292697 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.293047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" event={"ID":"79e70d5f-7ec6-4c79-9044-b9495bf01054","Type":"ContainerDied","Data":"09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.293092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-s6tcb" event={"ID":"79e70d5f-7ec6-4c79-9044-b9495bf01054","Type":"ContainerDied","Data":"129617a6acb65f61aadc69d0beae5e98a04cc95236f111a3accf8fc97b688277"} Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.293109 4764 scope.go:117] "RemoveContainer" containerID="09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.314996 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.674205008 podStartE2EDuration="5.314980497s" podCreationTimestamp="2025-10-01 16:21:22 +0000 UTC" firstStartedPulling="2025-10-01 16:21:23.064682109 +0000 UTC m=+1146.064328944" lastFinishedPulling="2025-10-01 16:21:26.705457598 +0000 UTC m=+1149.705104433" observedRunningTime="2025-10-01 16:21:27.31223305 +0000 UTC m=+1150.311879885" watchObservedRunningTime="2025-10-01 16:21:27.314980497 +0000 UTC m=+1150.314627332" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.352424 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.377723 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79e70d5f-7ec6-4c79-9044-b9495bf01054" (UID: "79e70d5f-7ec6-4c79-9044-b9495bf01054"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.381446 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.381587 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw89j\" (UniqueName: \"kubernetes.io/projected/79e70d5f-7ec6-4c79-9044-b9495bf01054-kube-api-access-tw89j\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.388017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-config" (OuterVolumeSpecName: "config") pod "79e70d5f-7ec6-4c79-9044-b9495bf01054" (UID: "79e70d5f-7ec6-4c79-9044-b9495bf01054"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.414921 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79e70d5f-7ec6-4c79-9044-b9495bf01054" (UID: "79e70d5f-7ec6-4c79-9044-b9495bf01054"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.427193 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79e70d5f-7ec6-4c79-9044-b9495bf01054" (UID: "79e70d5f-7ec6-4c79-9044-b9495bf01054"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.445397 4764 scope.go:117] "RemoveContainer" containerID="e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.469702 4764 scope.go:117] "RemoveContainer" containerID="09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e" Oct 01 16:21:27 crc kubenswrapper[4764]: E1001 16:21:27.470239 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e\": container with ID starting with 09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e not found: ID does not exist" containerID="09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.470268 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e"} err="failed to get container status \"09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e\": rpc error: code = NotFound desc = could not find container \"09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e\": container with ID starting with 09f1e536be9102decfd85029ab29aeb992b41c781c38bcee536801f7f40a461e not found: ID does not exist" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.470287 4764 scope.go:117] 
"RemoveContainer" containerID="e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46" Oct 01 16:21:27 crc kubenswrapper[4764]: E1001 16:21:27.470845 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46\": container with ID starting with e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46 not found: ID does not exist" containerID="e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.470891 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46"} err="failed to get container status \"e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46\": rpc error: code = NotFound desc = could not find container \"e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46\": container with ID starting with e5b5aef8ea89e6859231b449e2f0cbd2ab6860125700b4c388300fe64eb06a46 not found: ID does not exist" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.484319 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bkjt\" (UniqueName: \"kubernetes.io/projected/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-kube-api-access-5bkjt\") pod \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.484358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-combined-ca-bundle\") pod \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.484525 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-config-data\") pod \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.484550 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-logs\") pod \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.484884 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-nova-metadata-tls-certs\") pod \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\" (UID: \"a0aa8c29-c369-4a6d-a59c-ab2bd3175819\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.486331 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.486395 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.486406 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79e70d5f-7ec6-4c79-9044-b9495bf01054-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.489557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-logs" (OuterVolumeSpecName: "logs") pod 
"a0aa8c29-c369-4a6d-a59c-ab2bd3175819" (UID: "a0aa8c29-c369-4a6d-a59c-ab2bd3175819"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.493271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-kube-api-access-5bkjt" (OuterVolumeSpecName: "kube-api-access-5bkjt") pod "a0aa8c29-c369-4a6d-a59c-ab2bd3175819" (UID: "a0aa8c29-c369-4a6d-a59c-ab2bd3175819"). InnerVolumeSpecName "kube-api-access-5bkjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.517165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0aa8c29-c369-4a6d-a59c-ab2bd3175819" (UID: "a0aa8c29-c369-4a6d-a59c-ab2bd3175819"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.547288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a0aa8c29-c369-4a6d-a59c-ab2bd3175819" (UID: "a0aa8c29-c369-4a6d-a59c-ab2bd3175819"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.547308 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-config-data" (OuterVolumeSpecName: "config-data") pod "a0aa8c29-c369-4a6d-a59c-ab2bd3175819" (UID: "a0aa8c29-c369-4a6d-a59c-ab2bd3175819"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.597357 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bkjt\" (UniqueName: \"kubernetes.io/projected/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-kube-api-access-5bkjt\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.597391 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.597400 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.597410 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.597418 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0aa8c29-c369-4a6d-a59c-ab2bd3175819-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.634509 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"] Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.642752 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-s6tcb"] Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.653536 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.737399 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" path="/var/lib/kubelet/pods/79e70d5f-7ec6-4c79-9044-b9495bf01054/volumes" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.799736 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvvtz\" (UniqueName: \"kubernetes.io/projected/c6bfb577-6442-4db0-b962-a89441eb7a9c-kube-api-access-cvvtz\") pod \"c6bfb577-6442-4db0-b962-a89441eb7a9c\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.799794 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-scripts\") pod \"c6bfb577-6442-4db0-b962-a89441eb7a9c\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.800284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-combined-ca-bundle\") pod \"c6bfb577-6442-4db0-b962-a89441eb7a9c\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.800463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-config-data\") pod \"c6bfb577-6442-4db0-b962-a89441eb7a9c\" (UID: \"c6bfb577-6442-4db0-b962-a89441eb7a9c\") " Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.807226 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bfb577-6442-4db0-b962-a89441eb7a9c-kube-api-access-cvvtz" (OuterVolumeSpecName: 
"kube-api-access-cvvtz") pod "c6bfb577-6442-4db0-b962-a89441eb7a9c" (UID: "c6bfb577-6442-4db0-b962-a89441eb7a9c"). InnerVolumeSpecName "kube-api-access-cvvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.809635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-scripts" (OuterVolumeSpecName: "scripts") pod "c6bfb577-6442-4db0-b962-a89441eb7a9c" (UID: "c6bfb577-6442-4db0-b962-a89441eb7a9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.834139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-config-data" (OuterVolumeSpecName: "config-data") pod "c6bfb577-6442-4db0-b962-a89441eb7a9c" (UID: "c6bfb577-6442-4db0-b962-a89441eb7a9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.843018 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6bfb577-6442-4db0-b962-a89441eb7a9c" (UID: "c6bfb577-6442-4db0-b962-a89441eb7a9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.902432 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvvtz\" (UniqueName: \"kubernetes.io/projected/c6bfb577-6442-4db0-b962-a89441eb7a9c-kube-api-access-cvvtz\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.902457 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.902469 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:27 crc kubenswrapper[4764]: I1001 16:21:27.902478 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bfb577-6442-4db0-b962-a89441eb7a9c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.344271 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.345167 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.353905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r2sz8" event={"ID":"c6bfb577-6442-4db0-b962-a89441eb7a9c","Type":"ContainerDied","Data":"23e3e95e00f41a25f706a48614a24cc01267c2a0ae032b3c04014b2c045ddf9d"} Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.353939 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23e3e95e00f41a25f706a48614a24cc01267c2a0ae032b3c04014b2c045ddf9d" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.353956 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 16:21:28 crc kubenswrapper[4764]: E1001 16:21:28.354259 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b99c5e9-3966-41d0-af69-2d0eb7a86d25" containerName="nova-manage" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354274 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b99c5e9-3966-41d0-af69-2d0eb7a86d25" containerName="nova-manage" Oct 01 16:21:28 crc kubenswrapper[4764]: E1001 16:21:28.354304 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-log" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354311 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-log" Oct 01 16:21:28 crc kubenswrapper[4764]: E1001 16:21:28.354324 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerName="init" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354330 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerName="init" Oct 01 16:21:28 crc kubenswrapper[4764]: E1001 16:21:28.354339 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bfb577-6442-4db0-b962-a89441eb7a9c" containerName="nova-cell1-conductor-db-sync" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354345 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bfb577-6442-4db0-b962-a89441eb7a9c" containerName="nova-cell1-conductor-db-sync" Oct 01 16:21:28 crc kubenswrapper[4764]: E1001 16:21:28.354353 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-metadata" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354358 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-metadata" Oct 01 16:21:28 crc kubenswrapper[4764]: E1001 16:21:28.354367 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerName="dnsmasq-dns" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354373 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerName="dnsmasq-dns" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354513 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-metadata" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354529 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b99c5e9-3966-41d0-af69-2d0eb7a86d25" containerName="nova-manage" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354536 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" containerName="nova-metadata-log" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.354543 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e70d5f-7ec6-4c79-9044-b9495bf01054" containerName="dnsmasq-dns" Oct 01 16:21:28 crc 
kubenswrapper[4764]: I1001 16:21:28.354561 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bfb577-6442-4db0-b962-a89441eb7a9c" containerName="nova-cell1-conductor-db-sync" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.355748 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.355837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.357566 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d1e50ddd-40e7-470a-8454-aab5337c9469" containerName="nova-scheduler-scheduler" containerID="cri-o://e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" gracePeriod=30 Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.393524 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.412289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78988164-5797-4cee-a8a9-7f87adeb170a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.412359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78988164-5797-4cee-a8a9-7f87adeb170a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.412419 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqlt\" (UniqueName: \"kubernetes.io/projected/78988164-5797-4cee-a8a9-7f87adeb170a-kube-api-access-vnqlt\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.471575 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.480331 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.500706 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.502697 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.504909 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.507326 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.511349 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.513835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78988164-5797-4cee-a8a9-7f87adeb170a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.513876 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/78988164-5797-4cee-a8a9-7f87adeb170a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.513905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqlt\" (UniqueName: \"kubernetes.io/projected/78988164-5797-4cee-a8a9-7f87adeb170a-kube-api-access-vnqlt\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.519769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78988164-5797-4cee-a8a9-7f87adeb170a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.540669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78988164-5797-4cee-a8a9-7f87adeb170a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.543461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnqlt\" (UniqueName: \"kubernetes.io/projected/78988164-5797-4cee-a8a9-7f87adeb170a-kube-api-access-vnqlt\") pod \"nova-cell1-conductor-0\" (UID: \"78988164-5797-4cee-a8a9-7f87adeb170a\") " pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.615072 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d905f1ae-9a08-4050-8661-c7069a8d8b83-logs\") pod \"nova-metadata-0\" (UID: 
\"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.615156 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpfdm\" (UniqueName: \"kubernetes.io/projected/d905f1ae-9a08-4050-8661-c7069a8d8b83-kube-api-access-jpfdm\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.615223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-config-data\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.615244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.615269 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.706676 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.716444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d905f1ae-9a08-4050-8661-c7069a8d8b83-logs\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.716520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpfdm\" (UniqueName: \"kubernetes.io/projected/d905f1ae-9a08-4050-8661-c7069a8d8b83-kube-api-access-jpfdm\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.716585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-config-data\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.716607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.716643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.716978 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d905f1ae-9a08-4050-8661-c7069a8d8b83-logs\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.721685 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-config-data\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.725533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.726530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.744708 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpfdm\" (UniqueName: \"kubernetes.io/projected/d905f1ae-9a08-4050-8661-c7069a8d8b83-kube-api-access-jpfdm\") pod \"nova-metadata-0\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " pod="openstack/nova-metadata-0" Oct 01 16:21:28 crc kubenswrapper[4764]: I1001 16:21:28.894861 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:21:29 crc kubenswrapper[4764]: I1001 16:21:29.175926 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 16:21:29 crc kubenswrapper[4764]: W1001 16:21:29.182694 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78988164_5797_4cee_a8a9_7f87adeb170a.slice/crio-4806167aceca9e6494404ed595f08bf2434cb4f3ac8a95287f7f94dd4fafc3c3 WatchSource:0}: Error finding container 4806167aceca9e6494404ed595f08bf2434cb4f3ac8a95287f7f94dd4fafc3c3: Status 404 returned error can't find the container with id 4806167aceca9e6494404ed595f08bf2434cb4f3ac8a95287f7f94dd4fafc3c3 Oct 01 16:21:29 crc kubenswrapper[4764]: I1001 16:21:29.333189 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:21:29 crc kubenswrapper[4764]: W1001 16:21:29.346875 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd905f1ae_9a08_4050_8661_c7069a8d8b83.slice/crio-4bed8cf3a22a2b91d9f383ed0e21181d84cfbfa2d5a4ca61519d72c1aeaa10a1 WatchSource:0}: Error finding container 4bed8cf3a22a2b91d9f383ed0e21181d84cfbfa2d5a4ca61519d72c1aeaa10a1: Status 404 returned error can't find the container with id 4bed8cf3a22a2b91d9f383ed0e21181d84cfbfa2d5a4ca61519d72c1aeaa10a1 Oct 01 16:21:29 crc kubenswrapper[4764]: I1001 16:21:29.358909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78988164-5797-4cee-a8a9-7f87adeb170a","Type":"ContainerStarted","Data":"4806167aceca9e6494404ed595f08bf2434cb4f3ac8a95287f7f94dd4fafc3c3"} Oct 01 16:21:29 crc kubenswrapper[4764]: I1001 16:21:29.732521 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0aa8c29-c369-4a6d-a59c-ab2bd3175819" 
path="/var/lib/kubelet/pods/a0aa8c29-c369-4a6d-a59c-ab2bd3175819/volumes" Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.369401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d905f1ae-9a08-4050-8661-c7069a8d8b83","Type":"ContainerStarted","Data":"71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f"} Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.369451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d905f1ae-9a08-4050-8661-c7069a8d8b83","Type":"ContainerStarted","Data":"e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4"} Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.369463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d905f1ae-9a08-4050-8661-c7069a8d8b83","Type":"ContainerStarted","Data":"4bed8cf3a22a2b91d9f383ed0e21181d84cfbfa2d5a4ca61519d72c1aeaa10a1"} Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.374543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78988164-5797-4cee-a8a9-7f87adeb170a","Type":"ContainerStarted","Data":"1245edbde5be3638676532fe607d21ea8e7541c1defc74015f4b1ba8ceaf7dd3"} Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.375329 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.407215 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.407188028 podStartE2EDuration="2.407188028s" podCreationTimestamp="2025-10-01 16:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:30.397161441 +0000 UTC m=+1153.396808276" watchObservedRunningTime="2025-10-01 16:21:30.407188028 +0000 UTC 
m=+1153.406834883" Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.427512 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.427488099 podStartE2EDuration="2.427488099s" podCreationTimestamp="2025-10-01 16:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:30.417079032 +0000 UTC m=+1153.416725877" watchObservedRunningTime="2025-10-01 16:21:30.427488099 +0000 UTC m=+1153.427134944" Oct 01 16:21:30 crc kubenswrapper[4764]: E1001 16:21:30.509454 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98 is running failed: container process not found" containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 16:21:30 crc kubenswrapper[4764]: E1001 16:21:30.509717 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98 is running failed: container process not found" containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 16:21:30 crc kubenswrapper[4764]: E1001 16:21:30.509921 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98 is running failed: container process not found" containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 16:21:30 crc 
kubenswrapper[4764]: E1001 16:21:30.509957 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d1e50ddd-40e7-470a-8454-aab5337c9469" containerName="nova-scheduler-scheduler" Oct 01 16:21:30 crc kubenswrapper[4764]: I1001 16:21:30.955300 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.055668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqhqd\" (UniqueName: \"kubernetes.io/projected/d1e50ddd-40e7-470a-8454-aab5337c9469-kube-api-access-wqhqd\") pod \"d1e50ddd-40e7-470a-8454-aab5337c9469\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.055719 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-combined-ca-bundle\") pod \"d1e50ddd-40e7-470a-8454-aab5337c9469\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.055847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-config-data\") pod \"d1e50ddd-40e7-470a-8454-aab5337c9469\" (UID: \"d1e50ddd-40e7-470a-8454-aab5337c9469\") " Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.062661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e50ddd-40e7-470a-8454-aab5337c9469-kube-api-access-wqhqd" (OuterVolumeSpecName: "kube-api-access-wqhqd") pod "d1e50ddd-40e7-470a-8454-aab5337c9469" (UID: 
"d1e50ddd-40e7-470a-8454-aab5337c9469"). InnerVolumeSpecName "kube-api-access-wqhqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.105696 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1e50ddd-40e7-470a-8454-aab5337c9469" (UID: "d1e50ddd-40e7-470a-8454-aab5337c9469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.112527 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-config-data" (OuterVolumeSpecName: "config-data") pod "d1e50ddd-40e7-470a-8454-aab5337c9469" (UID: "d1e50ddd-40e7-470a-8454-aab5337c9469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.158075 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqhqd\" (UniqueName: \"kubernetes.io/projected/d1e50ddd-40e7-470a-8454-aab5337c9469-kube-api-access-wqhqd\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.158126 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.158143 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e50ddd-40e7-470a-8454-aab5337c9469-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.387559 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1e50ddd-40e7-470a-8454-aab5337c9469" 
containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" exitCode=0 Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.387750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1e50ddd-40e7-470a-8454-aab5337c9469","Type":"ContainerDied","Data":"e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98"} Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.387820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1e50ddd-40e7-470a-8454-aab5337c9469","Type":"ContainerDied","Data":"77a1a20e88a0615de7897a18580d54a4b97a0938915fe0f2857c47402d41e6c0"} Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.387848 4764 scope.go:117] "RemoveContainer" containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.388069 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.428719 4764 scope.go:117] "RemoveContainer" containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" Oct 01 16:21:31 crc kubenswrapper[4764]: E1001 16:21:31.430080 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98\": container with ID starting with e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98 not found: ID does not exist" containerID="e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.430132 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98"} err="failed to get container status 
\"e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98\": rpc error: code = NotFound desc = could not find container \"e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98\": container with ID starting with e172dd4f29309cac76301d1cd871096e41875f426c1f5b3af441f6ddf92dae98 not found: ID does not exist" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.434423 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.451448 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.464245 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:31 crc kubenswrapper[4764]: E1001 16:21:31.466470 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e50ddd-40e7-470a-8454-aab5337c9469" containerName="nova-scheduler-scheduler" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.466510 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e50ddd-40e7-470a-8454-aab5337c9469" containerName="nova-scheduler-scheduler" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.470623 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e50ddd-40e7-470a-8454-aab5337c9469" containerName="nova-scheduler-scheduler" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.483330 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.483548 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.492148 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.668730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.668850 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-config-data\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.668996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhj2z\" (UniqueName: \"kubernetes.io/projected/60ca659a-c997-4901-abd9-0200e6f16aea-kube-api-access-rhj2z\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.747281 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e50ddd-40e7-470a-8454-aab5337c9469" path="/var/lib/kubelet/pods/d1e50ddd-40e7-470a-8454-aab5337c9469/volumes" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.770632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhj2z\" (UniqueName: \"kubernetes.io/projected/60ca659a-c997-4901-abd9-0200e6f16aea-kube-api-access-rhj2z\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " 
pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.770813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.770882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-config-data\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.777264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.778160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-config-data\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.803327 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhj2z\" (UniqueName: \"kubernetes.io/projected/60ca659a-c997-4901-abd9-0200e6f16aea-kube-api-access-rhj2z\") pod \"nova-scheduler-0\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " pod="openstack/nova-scheduler-0" Oct 01 16:21:31 crc kubenswrapper[4764]: I1001 16:21:31.819887 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.319796 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:21:32 crc kubenswrapper[4764]: W1001 16:21:32.321475 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60ca659a_c997_4901_abd9_0200e6f16aea.slice/crio-6c0aa38303daeaba412b20e6c0aa3ffdbb94c7828eb360072baf02eaaebfc106 WatchSource:0}: Error finding container 6c0aa38303daeaba412b20e6c0aa3ffdbb94c7828eb360072baf02eaaebfc106: Status 404 returned error can't find the container with id 6c0aa38303daeaba412b20e6c0aa3ffdbb94c7828eb360072baf02eaaebfc106 Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.401521 4764 generic.go:334] "Generic (PLEG): container finished" podID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerID="03cdb772aa1c9e95835840fb1b5823a012e11cb175b6bd8f4750ce44e48aa7fa" exitCode=0 Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.402446 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29147115-0a7e-431e-9c9f-609ac6547ae7","Type":"ContainerDied","Data":"03cdb772aa1c9e95835840fb1b5823a012e11cb175b6bd8f4750ce44e48aa7fa"} Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.402564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"29147115-0a7e-431e-9c9f-609ac6547ae7","Type":"ContainerDied","Data":"a555e614f897fb5a673d835b34248541845c1f609776bf08f25181ad87d87858"} Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.402645 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a555e614f897fb5a673d835b34248541845c1f609776bf08f25181ad87d87858" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.406495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"60ca659a-c997-4901-abd9-0200e6f16aea","Type":"ContainerStarted","Data":"6c0aa38303daeaba412b20e6c0aa3ffdbb94c7828eb360072baf02eaaebfc106"} Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.524105 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.684159 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29147115-0a7e-431e-9c9f-609ac6547ae7-logs\") pod \"29147115-0a7e-431e-9c9f-609ac6547ae7\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.684634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-config-data\") pod \"29147115-0a7e-431e-9c9f-609ac6547ae7\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.684747 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmh9t\" (UniqueName: \"kubernetes.io/projected/29147115-0a7e-431e-9c9f-609ac6547ae7-kube-api-access-nmh9t\") pod \"29147115-0a7e-431e-9c9f-609ac6547ae7\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.684789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-combined-ca-bundle\") pod \"29147115-0a7e-431e-9c9f-609ac6547ae7\" (UID: \"29147115-0a7e-431e-9c9f-609ac6547ae7\") " Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.685697 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29147115-0a7e-431e-9c9f-609ac6547ae7-logs" (OuterVolumeSpecName: "logs") pod 
"29147115-0a7e-431e-9c9f-609ac6547ae7" (UID: "29147115-0a7e-431e-9c9f-609ac6547ae7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.690856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29147115-0a7e-431e-9c9f-609ac6547ae7-kube-api-access-nmh9t" (OuterVolumeSpecName: "kube-api-access-nmh9t") pod "29147115-0a7e-431e-9c9f-609ac6547ae7" (UID: "29147115-0a7e-431e-9c9f-609ac6547ae7"). InnerVolumeSpecName "kube-api-access-nmh9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.714419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29147115-0a7e-431e-9c9f-609ac6547ae7" (UID: "29147115-0a7e-431e-9c9f-609ac6547ae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.746299 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-config-data" (OuterVolumeSpecName: "config-data") pod "29147115-0a7e-431e-9c9f-609ac6547ae7" (UID: "29147115-0a7e-431e-9c9f-609ac6547ae7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.787197 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.787232 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmh9t\" (UniqueName: \"kubernetes.io/projected/29147115-0a7e-431e-9c9f-609ac6547ae7-kube-api-access-nmh9t\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.787246 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29147115-0a7e-431e-9c9f-609ac6547ae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:32 crc kubenswrapper[4764]: I1001 16:21:32.787257 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29147115-0a7e-431e-9c9f-609ac6547ae7-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.422949 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"60ca659a-c997-4901-abd9-0200e6f16aea","Type":"ContainerStarted","Data":"5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e"} Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.422982 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.458258 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.458231736 podStartE2EDuration="2.458231736s" podCreationTimestamp="2025-10-01 16:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:33.441922545 +0000 UTC m=+1156.441569390" watchObservedRunningTime="2025-10-01 16:21:33.458231736 +0000 UTC m=+1156.457878591" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.478380 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.491116 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.509037 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:33 crc kubenswrapper[4764]: E1001 16:21:33.509446 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-api" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.509467 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-api" Oct 01 16:21:33 crc kubenswrapper[4764]: E1001 16:21:33.509496 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-log" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.509505 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-log" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.509724 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" 
containerName="nova-api-log" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.509753 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" containerName="nova-api-api" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.512070 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.514262 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.524203 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.603852 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwszx\" (UniqueName: \"kubernetes.io/projected/64b12b65-db76-4f95-9ecc-d17e714bd5aa-kube-api-access-wwszx\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.603912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.604029 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-config-data\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.604128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/64b12b65-db76-4f95-9ecc-d17e714bd5aa-logs\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.705507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-config-data\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.705563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64b12b65-db76-4f95-9ecc-d17e714bd5aa-logs\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.705619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwszx\" (UniqueName: \"kubernetes.io/projected/64b12b65-db76-4f95-9ecc-d17e714bd5aa-kube-api-access-wwszx\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.705658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.706237 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64b12b65-db76-4f95-9ecc-d17e714bd5aa-logs\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.712118 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-config-data\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.722706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.733970 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29147115-0a7e-431e-9c9f-609ac6547ae7" path="/var/lib/kubelet/pods/29147115-0a7e-431e-9c9f-609ac6547ae7/volumes" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.735407 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwszx\" (UniqueName: \"kubernetes.io/projected/64b12b65-db76-4f95-9ecc-d17e714bd5aa-kube-api-access-wwszx\") pod \"nova-api-0\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.874675 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.896137 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 16:21:33 crc kubenswrapper[4764]: I1001 16:21:33.896196 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 16:21:34 crc kubenswrapper[4764]: I1001 16:21:34.339607 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:34 crc kubenswrapper[4764]: W1001 16:21:34.347021 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64b12b65_db76_4f95_9ecc_d17e714bd5aa.slice/crio-bc888c01b06ac669c3fca59a21fc64213af2c104706df28e1edb8eba3183d4f4 WatchSource:0}: Error finding container bc888c01b06ac669c3fca59a21fc64213af2c104706df28e1edb8eba3183d4f4: Status 404 returned error can't find the container with id bc888c01b06ac669c3fca59a21fc64213af2c104706df28e1edb8eba3183d4f4 Oct 01 16:21:34 crc kubenswrapper[4764]: I1001 16:21:34.446939 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"64b12b65-db76-4f95-9ecc-d17e714bd5aa","Type":"ContainerStarted","Data":"bc888c01b06ac669c3fca59a21fc64213af2c104706df28e1edb8eba3183d4f4"} Oct 01 16:21:35 crc kubenswrapper[4764]: I1001 16:21:35.462657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"64b12b65-db76-4f95-9ecc-d17e714bd5aa","Type":"ContainerStarted","Data":"172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2"} Oct 01 16:21:35 crc kubenswrapper[4764]: I1001 16:21:35.463534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"64b12b65-db76-4f95-9ecc-d17e714bd5aa","Type":"ContainerStarted","Data":"dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36"} Oct 01 16:21:35 crc kubenswrapper[4764]: 
I1001 16:21:35.498534 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.498500875 podStartE2EDuration="2.498500875s" podCreationTimestamp="2025-10-01 16:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:35.49143639 +0000 UTC m=+1158.491083285" watchObservedRunningTime="2025-10-01 16:21:35.498500875 +0000 UTC m=+1158.498147740" Oct 01 16:21:36 crc kubenswrapper[4764]: I1001 16:21:36.820267 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 16:21:38 crc kubenswrapper[4764]: I1001 16:21:38.755584 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 16:21:38 crc kubenswrapper[4764]: I1001 16:21:38.895970 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 16:21:38 crc kubenswrapper[4764]: I1001 16:21:38.896027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 16:21:39 crc kubenswrapper[4764]: I1001 16:21:39.907415 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:39 crc kubenswrapper[4764]: I1001 16:21:39.907427 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:41 crc kubenswrapper[4764]: I1001 
16:21:41.820592 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 16:21:41 crc kubenswrapper[4764]: I1001 16:21:41.868424 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 16:21:42 crc kubenswrapper[4764]: I1001 16:21:42.571358 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 16:21:43 crc kubenswrapper[4764]: I1001 16:21:43.875266 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 16:21:43 crc kubenswrapper[4764]: I1001 16:21:43.875323 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 16:21:44 crc kubenswrapper[4764]: I1001 16:21:44.958256 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:44 crc kubenswrapper[4764]: I1001 16:21:44.958256 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:21:48 crc kubenswrapper[4764]: I1001 16:21:48.905996 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 16:21:48 crc kubenswrapper[4764]: I1001 16:21:48.908977 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 16:21:48 crc kubenswrapper[4764]: I1001 16:21:48.914348 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Oct 01 16:21:49 crc kubenswrapper[4764]: I1001 16:21:49.617443 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.521539 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.618478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8bmm\" (UniqueName: \"kubernetes.io/projected/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-kube-api-access-f8bmm\") pod \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.618925 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-combined-ca-bundle\") pod \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.619176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-config-data\") pod \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\" (UID: \"b8eb9e30-0cde-4a8a-86fa-9201a5efe701\") " Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.621245 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8eb9e30-0cde-4a8a-86fa-9201a5efe701" containerID="f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981" exitCode=137 Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.621388 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.621422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b8eb9e30-0cde-4a8a-86fa-9201a5efe701","Type":"ContainerDied","Data":"f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981"} Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.621871 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b8eb9e30-0cde-4a8a-86fa-9201a5efe701","Type":"ContainerDied","Data":"97dfa3be2f8ece6cd4f347c3cc46d148740c40c192aa83d83451a9eb0aae88a1"} Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.621894 4764 scope.go:117] "RemoveContainer" containerID="f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.631068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-kube-api-access-f8bmm" (OuterVolumeSpecName: "kube-api-access-f8bmm") pod "b8eb9e30-0cde-4a8a-86fa-9201a5efe701" (UID: "b8eb9e30-0cde-4a8a-86fa-9201a5efe701"). InnerVolumeSpecName "kube-api-access-f8bmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.647916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8eb9e30-0cde-4a8a-86fa-9201a5efe701" (UID: "b8eb9e30-0cde-4a8a-86fa-9201a5efe701"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.652612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-config-data" (OuterVolumeSpecName: "config-data") pod "b8eb9e30-0cde-4a8a-86fa-9201a5efe701" (UID: "b8eb9e30-0cde-4a8a-86fa-9201a5efe701"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.722800 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.722827 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8bmm\" (UniqueName: \"kubernetes.io/projected/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-kube-api-access-f8bmm\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.722840 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8eb9e30-0cde-4a8a-86fa-9201a5efe701-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.723360 4764 scope.go:117] "RemoveContainer" containerID="f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981" Oct 01 16:21:50 crc kubenswrapper[4764]: E1001 16:21:50.723886 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981\": container with ID starting with f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981 not found: ID does not exist" containerID="f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 
16:21:50.724103 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981"} err="failed to get container status \"f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981\": rpc error: code = NotFound desc = could not find container \"f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981\": container with ID starting with f723856086bad6af723892a7d1c5fc6d1ab10e5e8438cf57b39976db9758d981 not found: ID does not exist" Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.960697 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:50 crc kubenswrapper[4764]: I1001 16:21:50.970432 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.002400 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:51 crc kubenswrapper[4764]: E1001 16:21:51.002966 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8eb9e30-0cde-4a8a-86fa-9201a5efe701" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.002979 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8eb9e30-0cde-4a8a-86fa-9201a5efe701" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.003191 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8eb9e30-0cde-4a8a-86fa-9201a5efe701" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.003763 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.011594 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.011982 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.011810 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.019386 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.131923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.132547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.132830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 
16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.133110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.133402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rnv\" (UniqueName: \"kubernetes.io/projected/384d37eb-2732-48d4-b38d-2befbd3d0cce-kube-api-access-58rnv\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.235297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.235381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.235427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.235472 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58rnv\" (UniqueName: \"kubernetes.io/projected/384d37eb-2732-48d4-b38d-2befbd3d0cce-kube-api-access-58rnv\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.235501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.238752 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.239217 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.241084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.241636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/384d37eb-2732-48d4-b38d-2befbd3d0cce-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.254879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rnv\" (UniqueName: \"kubernetes.io/projected/384d37eb-2732-48d4-b38d-2befbd3d0cce-kube-api-access-58rnv\") pod \"nova-cell1-novncproxy-0\" (UID: \"384d37eb-2732-48d4-b38d-2befbd3d0cce\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.330954 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.740443 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8eb9e30-0cde-4a8a-86fa-9201a5efe701" path="/var/lib/kubelet/pods/b8eb9e30-0cde-4a8a-86fa-9201a5efe701/volumes" Oct 01 16:21:51 crc kubenswrapper[4764]: I1001 16:21:51.778982 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 16:21:52 crc kubenswrapper[4764]: I1001 16:21:52.596081 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 16:21:52 crc kubenswrapper[4764]: I1001 16:21:52.644776 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"384d37eb-2732-48d4-b38d-2befbd3d0cce","Type":"ContainerStarted","Data":"28d8944108fc523faf9210cbad57ebce9c581e84f7472d79706f429349d0c2b5"} Oct 01 16:21:52 crc kubenswrapper[4764]: I1001 16:21:52.644857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"384d37eb-2732-48d4-b38d-2befbd3d0cce","Type":"ContainerStarted","Data":"7b10d5d4ea226e100b8489c1dd70bc99d56290e19fae0a2dfaa7cd0f469e4843"} 
Oct 01 16:21:52 crc kubenswrapper[4764]: I1001 16:21:52.703489 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.703455914 podStartE2EDuration="2.703455914s" podCreationTimestamp="2025-10-01 16:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:52.691812557 +0000 UTC m=+1175.691459402" watchObservedRunningTime="2025-10-01 16:21:52.703455914 +0000 UTC m=+1175.703102789" Oct 01 16:21:53 crc kubenswrapper[4764]: I1001 16:21:53.879522 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 16:21:53 crc kubenswrapper[4764]: I1001 16:21:53.880161 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 16:21:53 crc kubenswrapper[4764]: I1001 16:21:53.880674 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 16:21:53 crc kubenswrapper[4764]: I1001 16:21:53.882850 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 16:21:54 crc kubenswrapper[4764]: I1001 16:21:54.661135 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 16:21:54 crc kubenswrapper[4764]: I1001 16:21:54.664611 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 16:21:54 crc kubenswrapper[4764]: I1001 16:21:54.930367 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mmg6q"] Oct 01 16:21:54 crc kubenswrapper[4764]: I1001 16:21:54.940407 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:54 crc kubenswrapper[4764]: I1001 16:21:54.982103 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mmg6q"] Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.009221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86r8t\" (UniqueName: \"kubernetes.io/projected/4a2a8617-a485-4228-94ff-874d395bc9a8-kube-api-access-86r8t\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.009279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.009303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.009376 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-config\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.009422 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-dns-svc\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.112584 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86r8t\" (UniqueName: \"kubernetes.io/projected/4a2a8617-a485-4228-94ff-874d395bc9a8-kube-api-access-86r8t\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.112637 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.112663 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.112730 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-config\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.112922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-dns-svc\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.113884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.114036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-dns-svc\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.114178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-config\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.115375 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.130410 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86r8t\" (UniqueName: \"kubernetes.io/projected/4a2a8617-a485-4228-94ff-874d395bc9a8-kube-api-access-86r8t\") pod 
\"dnsmasq-dns-5b856c5697-mmg6q\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.265196 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:21:55 crc kubenswrapper[4764]: W1001 16:21:55.729867 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2a8617_a485_4228_94ff_874d395bc9a8.slice/crio-d7d24143e3b9015fa62edb50a7f6c610f96fab99879b32fb52769f07653b5696 WatchSource:0}: Error finding container d7d24143e3b9015fa62edb50a7f6c610f96fab99879b32fb52769f07653b5696: Status 404 returned error can't find the container with id d7d24143e3b9015fa62edb50a7f6c610f96fab99879b32fb52769f07653b5696 Oct 01 16:21:55 crc kubenswrapper[4764]: I1001 16:21:55.738594 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mmg6q"] Oct 01 16:21:56 crc kubenswrapper[4764]: I1001 16:21:56.331778 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:21:56 crc kubenswrapper[4764]: I1001 16:21:56.698839 4764 generic.go:334] "Generic (PLEG): container finished" podID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerID="0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4" exitCode=0 Oct 01 16:21:56 crc kubenswrapper[4764]: I1001 16:21:56.698939 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" event={"ID":"4a2a8617-a485-4228-94ff-874d395bc9a8","Type":"ContainerDied","Data":"0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4"} Oct 01 16:21:56 crc kubenswrapper[4764]: I1001 16:21:56.699370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" 
event={"ID":"4a2a8617-a485-4228-94ff-874d395bc9a8","Type":"ContainerStarted","Data":"d7d24143e3b9015fa62edb50a7f6c610f96fab99879b32fb52769f07653b5696"} Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.028238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.028744 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-central-agent" containerID="cri-o://392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b" gracePeriod=30 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.028908 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="proxy-httpd" containerID="cri-o://a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a" gracePeriod=30 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.029005 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="sg-core" containerID="cri-o://1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc" gracePeriod=30 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.029020 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-notification-agent" containerID="cri-o://560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510" gracePeriod=30 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.218204 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.709024 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerID="a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a" exitCode=0 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.709158 4764 generic.go:334] "Generic (PLEG): container finished" podID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerID="1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc" exitCode=2 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.709170 4764 generic.go:334] "Generic (PLEG): container finished" podID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerID="392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b" exitCode=0 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.709110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerDied","Data":"a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a"} Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.709239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerDied","Data":"1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc"} Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.709256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerDied","Data":"392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b"} Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.710746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" event={"ID":"4a2a8617-a485-4228-94ff-874d395bc9a8","Type":"ContainerStarted","Data":"cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd"} Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.710871 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-api-0" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-log" containerID="cri-o://dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36" gracePeriod=30 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.710911 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-api" containerID="cri-o://172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2" gracePeriod=30 Oct 01 16:21:57 crc kubenswrapper[4764]: I1001 16:21:57.738813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" podStartSLOduration=3.738796583 podStartE2EDuration="3.738796583s" podCreationTimestamp="2025-10-01 16:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:21:57.738288699 +0000 UTC m=+1180.737935524" watchObservedRunningTime="2025-10-01 16:21:57.738796583 +0000 UTC m=+1180.738443418" Oct 01 16:21:58 crc kubenswrapper[4764]: I1001 16:21:58.720649 4764 generic.go:334] "Generic (PLEG): container finished" podID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerID="dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36" exitCode=143 Oct 01 16:21:58 crc kubenswrapper[4764]: I1001 16:21:58.720733 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"64b12b65-db76-4f95-9ecc-d17e714bd5aa","Type":"ContainerDied","Data":"dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36"} Oct 01 16:21:58 crc kubenswrapper[4764]: I1001 16:21:58.722295 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.331826 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.348684 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.395467 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.448196 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64b12b65-db76-4f95-9ecc-d17e714bd5aa-logs\") pod \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.448249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwszx\" (UniqueName: \"kubernetes.io/projected/64b12b65-db76-4f95-9ecc-d17e714bd5aa-kube-api-access-wwszx\") pod \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.448299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-combined-ca-bundle\") pod \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.448372 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-config-data\") pod \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\" (UID: \"64b12b65-db76-4f95-9ecc-d17e714bd5aa\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.449000 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/64b12b65-db76-4f95-9ecc-d17e714bd5aa-logs" (OuterVolumeSpecName: "logs") pod "64b12b65-db76-4f95-9ecc-d17e714bd5aa" (UID: "64b12b65-db76-4f95-9ecc-d17e714bd5aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.468739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b12b65-db76-4f95-9ecc-d17e714bd5aa-kube-api-access-wwszx" (OuterVolumeSpecName: "kube-api-access-wwszx") pod "64b12b65-db76-4f95-9ecc-d17e714bd5aa" (UID: "64b12b65-db76-4f95-9ecc-d17e714bd5aa"). InnerVolumeSpecName "kube-api-access-wwszx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.484148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-config-data" (OuterVolumeSpecName: "config-data") pod "64b12b65-db76-4f95-9ecc-d17e714bd5aa" (UID: "64b12b65-db76-4f95-9ecc-d17e714bd5aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.511710 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64b12b65-db76-4f95-9ecc-d17e714bd5aa" (UID: "64b12b65-db76-4f95-9ecc-d17e714bd5aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.553460 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.553487 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b12b65-db76-4f95-9ecc-d17e714bd5aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.553497 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64b12b65-db76-4f95-9ecc-d17e714bd5aa-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.553505 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwszx\" (UniqueName: \"kubernetes.io/projected/64b12b65-db76-4f95-9ecc-d17e714bd5aa-kube-api-access-wwszx\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.678584 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.749642 4764 generic.go:334] "Generic (PLEG): container finished" podID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerID="172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2" exitCode=0 Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.749685 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.749714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"64b12b65-db76-4f95-9ecc-d17e714bd5aa","Type":"ContainerDied","Data":"172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2"} Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.749743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"64b12b65-db76-4f95-9ecc-d17e714bd5aa","Type":"ContainerDied","Data":"bc888c01b06ac669c3fca59a21fc64213af2c104706df28e1edb8eba3183d4f4"} Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.749763 4764 scope.go:117] "RemoveContainer" containerID="172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.758577 4764 generic.go:334] "Generic (PLEG): container finished" podID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerID="560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510" exitCode=0 Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.758646 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-config-data\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.758819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs2kv\" (UniqueName: \"kubernetes.io/projected/c8729af9-a340-4aaf-8954-eb497e3c8d3d-kube-api-access-cs2kv\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-log-httpd\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759267 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759320 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-scripts\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759367 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-run-httpd\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-ceilometer-tls-certs\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759476 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-sg-core-conf-yaml\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-combined-ca-bundle\") pod \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\" (UID: \"c8729af9-a340-4aaf-8954-eb497e3c8d3d\") " Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759796 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerDied","Data":"560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510"} Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.759866 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8729af9-a340-4aaf-8954-eb497e3c8d3d","Type":"ContainerDied","Data":"1a6907d7d1599d4a0d684bd0e91f03f80c01205e68a562ccc807e433a1cf5916"} Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.760299 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.760375 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.777512 4764 scope.go:117] "RemoveContainer" containerID="dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.779102 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.786151 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.810597 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8729af9-a340-4aaf-8954-eb497e3c8d3d-kube-api-access-cs2kv" (OuterVolumeSpecName: "kube-api-access-cs2kv") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "kube-api-access-cs2kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.814135 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.825267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-scripts" (OuterVolumeSpecName: "scripts") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.825995 4764 scope.go:117] "RemoveContainer" containerID="172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.826142 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.826558 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-log" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.826577 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-log" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.826594 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="proxy-httpd" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.826601 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="proxy-httpd" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.828826 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="sg-core" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.828861 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="sg-core" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.828883 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-central-agent" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.828890 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-central-agent" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 
16:22:01.828904 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-notification-agent" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.828911 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-notification-agent" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.829319 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-api" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.829334 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-api" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.830323 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2\": container with ID starting with 172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2 not found: ID does not exist" containerID="172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830364 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2"} err="failed to get container status \"172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2\": rpc error: code = NotFound desc = could not find container \"172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2\": container with ID starting with 172dc32709476de3216b752690003da31eed3807c2657995d1344516f44e49d2 not found: ID does not exist" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830388 4764 scope.go:117] "RemoveContainer" 
containerID="dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830576 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="sg-core" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830590 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-api" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830608 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="proxy-httpd" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830626 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-central-agent" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830636 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" containerName="nova-api-log" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.830646 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" containerName="ceilometer-notification-agent" Oct 01 16:22:01 crc kubenswrapper[4764]: E1001 16:22:01.837608 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36\": container with ID starting with dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36 not found: ID does not exist" containerID="dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.837665 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36"} 
err="failed to get container status \"dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36\": rpc error: code = NotFound desc = could not find container \"dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36\": container with ID starting with dbe1f3fbb26a402d6b7fa743574ff829af4f8e426e6352e1c46c5dd5ce2b5e36 not found: ID does not exist" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.837693 4764 scope.go:117] "RemoveContainer" containerID="a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.838637 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.842478 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.843609 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.843662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.843797 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.869456 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs2kv\" (UniqueName: \"kubernetes.io/projected/c8729af9-a340-4aaf-8954-eb497e3c8d3d-kube-api-access-cs2kv\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.869484 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8729af9-a340-4aaf-8954-eb497e3c8d3d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.869494 4764 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.873656 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.878645 4764 scope.go:117] "RemoveContainer" containerID="1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.890729 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.914376 4764 scope.go:117] "RemoveContainer" containerID="560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.915738 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: "c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.963517 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-plxmm"] Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.964738 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-plxmm"] Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.964818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7vq\" (UniqueName: \"kubernetes.io/projected/2d832bc9-1a40-482e-bec6-f785dec32a1a-kube-api-access-xs7vq\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zptpv\" (UniqueName: \"kubernetes.io/projected/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-kube-api-access-zptpv\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-config-data\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-scripts\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998975 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-config-data\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.998994 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d832bc9-1a40-482e-bec6-f785dec32a1a-logs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999016 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-combined-ca-bundle\") pod 
\"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999079 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-public-tls-certs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999142 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999152 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999160 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:01 crc kubenswrapper[4764]: I1001 16:22:01.999775 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:01.999948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.007758 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-config-data" (OuterVolumeSpecName: "config-data") pod "c8729af9-a340-4aaf-8954-eb497e3c8d3d" (UID: 
"c8729af9-a340-4aaf-8954-eb497e3c8d3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.025128 4764 scope.go:117] "RemoveContainer" containerID="392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.062690 4764 scope.go:117] "RemoveContainer" containerID="a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a" Oct 01 16:22:02 crc kubenswrapper[4764]: E1001 16:22:02.063505 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a\": container with ID starting with a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a not found: ID does not exist" containerID="a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.063539 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a"} err="failed to get container status \"a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a\": rpc error: code = NotFound desc = could not find container \"a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a\": container with ID starting with a74fb18744ae197e44b8aa0650bbf96f420b16e2f2a04a67cc5f701b1705ca2a not found: ID does not exist" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.063561 4764 scope.go:117] "RemoveContainer" containerID="1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc" Oct 01 16:22:02 crc kubenswrapper[4764]: E1001 16:22:02.064548 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc\": container with 
ID starting with 1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc not found: ID does not exist" containerID="1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.064571 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc"} err="failed to get container status \"1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc\": rpc error: code = NotFound desc = could not find container \"1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc\": container with ID starting with 1f2c665a545383505f17971b6698b933f435f142ac19a532fa89b3c91126a9fc not found: ID does not exist" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.064585 4764 scope.go:117] "RemoveContainer" containerID="560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510" Oct 01 16:22:02 crc kubenswrapper[4764]: E1001 16:22:02.064859 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510\": container with ID starting with 560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510 not found: ID does not exist" containerID="560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.064886 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510"} err="failed to get container status \"560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510\": rpc error: code = NotFound desc = could not find container \"560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510\": container with ID starting with 560055be3fe8e1a63f2891836f506775679514e7aa5f3056c4025b5a3bd8f510 not 
found: ID does not exist" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.064910 4764 scope.go:117] "RemoveContainer" containerID="392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b" Oct 01 16:22:02 crc kubenswrapper[4764]: E1001 16:22:02.065246 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b\": container with ID starting with 392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b not found: ID does not exist" containerID="392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.065290 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b"} err="failed to get container status \"392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b\": rpc error: code = NotFound desc = could not find container \"392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b\": container with ID starting with 392dba833bacfd5bf974c5ad1e54ae0cd28af0a17d6d7d9303595d6d2900ad7b not found: ID does not exist" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.092960 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7vq\" (UniqueName: \"kubernetes.io/projected/2d832bc9-1a40-482e-bec6-f785dec32a1a-kube-api-access-xs7vq\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zptpv\" (UniqueName: 
\"kubernetes.io/projected/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-kube-api-access-zptpv\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-config-data\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-scripts\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104479 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-config-data\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d832bc9-1a40-482e-bec6-f785dec32a1a-logs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " 
pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104572 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-public-tls-certs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.104680 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8729af9-a340-4aaf-8954-eb497e3c8d3d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.109141 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.109526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d832bc9-1a40-482e-bec6-f785dec32a1a-logs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.114958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-config-data\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.119006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.119095 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-public-tls-certs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.119560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-scripts\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.119709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-config-data\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.129145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 
16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.130298 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.132480 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.135500 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.136088 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.136569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7vq\" (UniqueName: \"kubernetes.io/projected/2d832bc9-1a40-482e-bec6-f785dec32a1a-kube-api-access-xs7vq\") pod \"nova-api-0\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.138026 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.138428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zptpv\" (UniqueName: \"kubernetes.io/projected/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-kube-api-access-zptpv\") pod \"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.146821 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.147771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-combined-ca-bundle\") pod 
\"nova-cell1-cell-mapping-plxmm\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.174954 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307186 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307239 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-run-httpd\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-config-data\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-log-httpd\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfzcr\" (UniqueName: \"kubernetes.io/projected/dee403a9-1cba-407b-9235-187a8553761d-kube-api-access-jfzcr\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.307468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-scripts\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.364165 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.409119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-log-httpd\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.409165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfzcr\" (UniqueName: \"kubernetes.io/projected/dee403a9-1cba-407b-9235-187a8553761d-kube-api-access-jfzcr\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.409219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-scripts\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.409583 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-log-httpd\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.409264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.410090 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-run-httpd\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.410170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.410189 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-config-data\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.410223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.410509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-run-httpd\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.413177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: 
I1001 16:22:02.413481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.427523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.427967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-config-data\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.428120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-scripts\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.430731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfzcr\" (UniqueName: \"kubernetes.io/projected/dee403a9-1cba-407b-9235-187a8553761d-kube-api-access-jfzcr\") pod \"ceilometer-0\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") " pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.565923 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.609179 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.773577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d832bc9-1a40-482e-bec6-f785dec32a1a","Type":"ContainerStarted","Data":"716ab24aad910266fe49eebba03a60f848290f3b834f4ac55620d46ec2ed2c69"} Oct 01 16:22:02 crc kubenswrapper[4764]: I1001 16:22:02.838979 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-plxmm"] Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.024872 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:22:03 crc kubenswrapper[4764]: W1001 16:22:03.029025 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddee403a9_1cba_407b_9235_187a8553761d.slice/crio-bb98e5eb03e5d1ffe0c479129fc3720df8864e5df7aa54920d4db978d399b55e WatchSource:0}: Error finding container bb98e5eb03e5d1ffe0c479129fc3720df8864e5df7aa54920d4db978d399b55e: Status 404 returned error can't find the container with id bb98e5eb03e5d1ffe0c479129fc3720df8864e5df7aa54920d4db978d399b55e Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.732564 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b12b65-db76-4f95-9ecc-d17e714bd5aa" path="/var/lib/kubelet/pods/64b12b65-db76-4f95-9ecc-d17e714bd5aa/volumes" Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.734900 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8729af9-a340-4aaf-8954-eb497e3c8d3d" path="/var/lib/kubelet/pods/c8729af9-a340-4aaf-8954-eb497e3c8d3d/volumes" Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.793981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerStarted","Data":"bb98e5eb03e5d1ffe0c479129fc3720df8864e5df7aa54920d4db978d399b55e"} Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.796278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d832bc9-1a40-482e-bec6-f785dec32a1a","Type":"ContainerStarted","Data":"88b1426559b98be5ce9b8e93e655cdafb563ecc05bb0427370baf013853e5292"} Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.796307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d832bc9-1a40-482e-bec6-f785dec32a1a","Type":"ContainerStarted","Data":"cca84d56b5ca81e9471bed13e82781c34d4a956d21e8e27a8804035180c75e2e"} Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.803565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-plxmm" event={"ID":"5bfc22be-0529-4b69-b782-c21bdd4fdaa6","Type":"ContainerStarted","Data":"0b10634bad5c7a3310f86837fad12ef9e8f0631d15248903281ab255943c2654"} Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.803735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-plxmm" event={"ID":"5bfc22be-0529-4b69-b782-c21bdd4fdaa6","Type":"ContainerStarted","Data":"919e97d9c3c021142ff4b1dc0724356d01f832e0a633efcebcbf0b43f5afb717"} Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.825749 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.825733885 podStartE2EDuration="2.825733885s" podCreationTimestamp="2025-10-01 16:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:22:03.81698494 +0000 UTC m=+1186.816631795" watchObservedRunningTime="2025-10-01 16:22:03.825733885 +0000 UTC m=+1186.825380720" Oct 01 16:22:03 crc kubenswrapper[4764]: I1001 16:22:03.836191 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-plxmm" podStartSLOduration=2.836177203 podStartE2EDuration="2.836177203s" podCreationTimestamp="2025-10-01 16:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:22:03.835721501 +0000 UTC m=+1186.835368346" watchObservedRunningTime="2025-10-01 16:22:03.836177203 +0000 UTC m=+1186.835824038" Oct 01 16:22:04 crc kubenswrapper[4764]: I1001 16:22:04.814404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerStarted","Data":"ad1413944c0e57d62005405d20adc25e72f611b869671fd6b77b759aed6621cb"} Oct 01 16:22:05 crc kubenswrapper[4764]: I1001 16:22:05.266186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:22:05 crc kubenswrapper[4764]: I1001 16:22:05.342030 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-zmqd5"] Oct 01 16:22:05 crc kubenswrapper[4764]: I1001 16:22:05.342300 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="dnsmasq-dns" containerID="cri-o://45aa2458a483632cefb43d43e7a67e090d8da9c60c74ff2d5fa6f1185dd8b6be" gracePeriod=10 Oct 01 16:22:05 crc kubenswrapper[4764]: I1001 16:22:05.825386 4764 generic.go:334] "Generic (PLEG): container finished" podID="75884291-058a-479e-9c6f-9880d64900fe" containerID="45aa2458a483632cefb43d43e7a67e090d8da9c60c74ff2d5fa6f1185dd8b6be" exitCode=0 Oct 01 16:22:05 crc kubenswrapper[4764]: I1001 16:22:05.825446 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" 
event={"ID":"75884291-058a-479e-9c6f-9880d64900fe","Type":"ContainerDied","Data":"45aa2458a483632cefb43d43e7a67e090d8da9c60c74ff2d5fa6f1185dd8b6be"} Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.025117 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.174:5353: connect: connection refused" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.297322 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.399935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdfgl\" (UniqueName: \"kubernetes.io/projected/75884291-058a-479e-9c6f-9880d64900fe-kube-api-access-mdfgl\") pod \"75884291-058a-479e-9c6f-9880d64900fe\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.400007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-config\") pod \"75884291-058a-479e-9c6f-9880d64900fe\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.400081 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-dns-svc\") pod \"75884291-058a-479e-9c6f-9880d64900fe\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.400152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-nb\") pod 
\"75884291-058a-479e-9c6f-9880d64900fe\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.400192 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-sb\") pod \"75884291-058a-479e-9c6f-9880d64900fe\" (UID: \"75884291-058a-479e-9c6f-9880d64900fe\") " Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.405240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75884291-058a-479e-9c6f-9880d64900fe-kube-api-access-mdfgl" (OuterVolumeSpecName: "kube-api-access-mdfgl") pod "75884291-058a-479e-9c6f-9880d64900fe" (UID: "75884291-058a-479e-9c6f-9880d64900fe"). InnerVolumeSpecName "kube-api-access-mdfgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.450445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75884291-058a-479e-9c6f-9880d64900fe" (UID: "75884291-058a-479e-9c6f-9880d64900fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.451508 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75884291-058a-479e-9c6f-9880d64900fe" (UID: "75884291-058a-479e-9c6f-9880d64900fe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.460568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-config" (OuterVolumeSpecName: "config") pod "75884291-058a-479e-9c6f-9880d64900fe" (UID: "75884291-058a-479e-9c6f-9880d64900fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.472476 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75884291-058a-479e-9c6f-9880d64900fe" (UID: "75884291-058a-479e-9c6f-9880d64900fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.502743 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.503006 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.503018 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdfgl\" (UniqueName: \"kubernetes.io/projected/75884291-058a-479e-9c6f-9880d64900fe-kube-api-access-mdfgl\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.503030 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-config\") on node \"crc\" DevicePath \"\"" Oct 01 
16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.503060 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75884291-058a-479e-9c6f-9880d64900fe-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.845020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" event={"ID":"75884291-058a-479e-9c6f-9880d64900fe","Type":"ContainerDied","Data":"476a656e1d674883d9ddbb8552292d2d747be09387f39173fa1ce7af97350a4a"} Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.845212 4764 scope.go:117] "RemoveContainer" containerID="45aa2458a483632cefb43d43e7a67e090d8da9c60c74ff2d5fa6f1185dd8b6be" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.845400 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-zmqd5" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.850892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerStarted","Data":"3a823132482b25719c0252dfc36b189fcb49373c7366e35a99ce58342229cf82"} Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.850943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerStarted","Data":"ad837dda32772c7267432ea5793da46b58b872080a91f39cf12568664f252023"} Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.876089 4764 scope.go:117] "RemoveContainer" containerID="b11f1651486853ec9e9cd33d78cb0721eb7ac54b38c8b8f9380a2a3fd6873276" Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.889386 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-zmqd5"] Oct 01 16:22:06 crc kubenswrapper[4764]: I1001 16:22:06.895130 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-566b5b7845-zmqd5"] Oct 01 16:22:07 crc kubenswrapper[4764]: I1001 16:22:07.738597 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75884291-058a-479e-9c6f-9880d64900fe" path="/var/lib/kubelet/pods/75884291-058a-479e-9c6f-9880d64900fe/volumes" Oct 01 16:22:08 crc kubenswrapper[4764]: I1001 16:22:08.870209 4764 generic.go:334] "Generic (PLEG): container finished" podID="5bfc22be-0529-4b69-b782-c21bdd4fdaa6" containerID="0b10634bad5c7a3310f86837fad12ef9e8f0631d15248903281ab255943c2654" exitCode=0 Oct 01 16:22:08 crc kubenswrapper[4764]: I1001 16:22:08.870288 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-plxmm" event={"ID":"5bfc22be-0529-4b69-b782-c21bdd4fdaa6","Type":"ContainerDied","Data":"0b10634bad5c7a3310f86837fad12ef9e8f0631d15248903281ab255943c2654"} Oct 01 16:22:09 crc kubenswrapper[4764]: I1001 16:22:09.910233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerStarted","Data":"35b16b12861892a707ba8d2e07d2c3cee27c29e43f6be0e048fb012bb282f9f2"} Oct 01 16:22:09 crc kubenswrapper[4764]: I1001 16:22:09.911114 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:22:09 crc kubenswrapper[4764]: I1001 16:22:09.950205 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.024069079 podStartE2EDuration="7.950182661s" podCreationTimestamp="2025-10-01 16:22:02 +0000 UTC" firstStartedPulling="2025-10-01 16:22:03.031400806 +0000 UTC m=+1186.031047641" lastFinishedPulling="2025-10-01 16:22:08.957514378 +0000 UTC m=+1191.957161223" observedRunningTime="2025-10-01 16:22:09.934873384 +0000 UTC m=+1192.934520229" watchObservedRunningTime="2025-10-01 16:22:09.950182661 +0000 UTC m=+1192.949829506" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.314029 
4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.496947 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-combined-ca-bundle\") pod \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.497027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-config-data\") pod \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.497163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-scripts\") pod \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.497205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zptpv\" (UniqueName: \"kubernetes.io/projected/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-kube-api-access-zptpv\") pod \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\" (UID: \"5bfc22be-0529-4b69-b782-c21bdd4fdaa6\") " Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.503225 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-scripts" (OuterVolumeSpecName: "scripts") pod "5bfc22be-0529-4b69-b782-c21bdd4fdaa6" (UID: "5bfc22be-0529-4b69-b782-c21bdd4fdaa6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.503910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-kube-api-access-zptpv" (OuterVolumeSpecName: "kube-api-access-zptpv") pod "5bfc22be-0529-4b69-b782-c21bdd4fdaa6" (UID: "5bfc22be-0529-4b69-b782-c21bdd4fdaa6"). InnerVolumeSpecName "kube-api-access-zptpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.532341 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-config-data" (OuterVolumeSpecName: "config-data") pod "5bfc22be-0529-4b69-b782-c21bdd4fdaa6" (UID: "5bfc22be-0529-4b69-b782-c21bdd4fdaa6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.544625 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bfc22be-0529-4b69-b782-c21bdd4fdaa6" (UID: "5bfc22be-0529-4b69-b782-c21bdd4fdaa6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.599474 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.599802 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zptpv\" (UniqueName: \"kubernetes.io/projected/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-kube-api-access-zptpv\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.599897 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.599986 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfc22be-0529-4b69-b782-c21bdd4fdaa6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.934251 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-plxmm" Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.934233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-plxmm" event={"ID":"5bfc22be-0529-4b69-b782-c21bdd4fdaa6","Type":"ContainerDied","Data":"919e97d9c3c021142ff4b1dc0724356d01f832e0a633efcebcbf0b43f5afb717"} Oct 01 16:22:10 crc kubenswrapper[4764]: I1001 16:22:10.934332 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="919e97d9c3c021142ff4b1dc0724356d01f832e0a633efcebcbf0b43f5afb717" Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.080513 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.080790 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-log" containerID="cri-o://cca84d56b5ca81e9471bed13e82781c34d4a956d21e8e27a8804035180c75e2e" gracePeriod=30 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.080947 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-api" containerID="cri-o://88b1426559b98be5ce9b8e93e655cdafb563ecc05bb0427370baf013853e5292" gracePeriod=30 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.107118 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.107381 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="60ca659a-c997-4901-abd9-0200e6f16aea" containerName="nova-scheduler-scheduler" containerID="cri-o://5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e" gracePeriod=30 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 
16:22:11.133253 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.133551 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-log" containerID="cri-o://e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4" gracePeriod=30 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.133712 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-metadata" containerID="cri-o://71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f" gracePeriod=30 Oct 01 16:22:11 crc kubenswrapper[4764]: E1001 16:22:11.823455 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 16:22:11 crc kubenswrapper[4764]: E1001 16:22:11.825636 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 16:22:11 crc kubenswrapper[4764]: E1001 16:22:11.827643 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 
16:22:11 crc kubenswrapper[4764]: E1001 16:22:11.827698 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="60ca659a-c997-4901-abd9-0200e6f16aea" containerName="nova-scheduler-scheduler" Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.951825 4764 generic.go:334] "Generic (PLEG): container finished" podID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerID="e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4" exitCode=143 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.951912 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d905f1ae-9a08-4050-8661-c7069a8d8b83","Type":"ContainerDied","Data":"e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4"} Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.958084 4764 generic.go:334] "Generic (PLEG): container finished" podID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerID="88b1426559b98be5ce9b8e93e655cdafb563ecc05bb0427370baf013853e5292" exitCode=0 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.958111 4764 generic.go:334] "Generic (PLEG): container finished" podID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerID="cca84d56b5ca81e9471bed13e82781c34d4a956d21e8e27a8804035180c75e2e" exitCode=143 Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.958131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d832bc9-1a40-482e-bec6-f785dec32a1a","Type":"ContainerDied","Data":"88b1426559b98be5ce9b8e93e655cdafb563ecc05bb0427370baf013853e5292"} Oct 01 16:22:11 crc kubenswrapper[4764]: I1001 16:22:11.958150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2d832bc9-1a40-482e-bec6-f785dec32a1a","Type":"ContainerDied","Data":"cca84d56b5ca81e9471bed13e82781c34d4a956d21e8e27a8804035180c75e2e"} Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.265522 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.339500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-internal-tls-certs\") pod \"2d832bc9-1a40-482e-bec6-f785dec32a1a\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.339593 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-config-data\") pod \"2d832bc9-1a40-482e-bec6-f785dec32a1a\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.339637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs7vq\" (UniqueName: \"kubernetes.io/projected/2d832bc9-1a40-482e-bec6-f785dec32a1a-kube-api-access-xs7vq\") pod \"2d832bc9-1a40-482e-bec6-f785dec32a1a\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.339712 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d832bc9-1a40-482e-bec6-f785dec32a1a-logs\") pod \"2d832bc9-1a40-482e-bec6-f785dec32a1a\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.339793 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-combined-ca-bundle\") pod 
\"2d832bc9-1a40-482e-bec6-f785dec32a1a\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.339908 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-public-tls-certs\") pod \"2d832bc9-1a40-482e-bec6-f785dec32a1a\" (UID: \"2d832bc9-1a40-482e-bec6-f785dec32a1a\") " Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.340119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d832bc9-1a40-482e-bec6-f785dec32a1a-logs" (OuterVolumeSpecName: "logs") pod "2d832bc9-1a40-482e-bec6-f785dec32a1a" (UID: "2d832bc9-1a40-482e-bec6-f785dec32a1a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.340623 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d832bc9-1a40-482e-bec6-f785dec32a1a-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.345517 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d832bc9-1a40-482e-bec6-f785dec32a1a-kube-api-access-xs7vq" (OuterVolumeSpecName: "kube-api-access-xs7vq") pod "2d832bc9-1a40-482e-bec6-f785dec32a1a" (UID: "2d832bc9-1a40-482e-bec6-f785dec32a1a"). InnerVolumeSpecName "kube-api-access-xs7vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.375265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-config-data" (OuterVolumeSpecName: "config-data") pod "2d832bc9-1a40-482e-bec6-f785dec32a1a" (UID: "2d832bc9-1a40-482e-bec6-f785dec32a1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.381133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d832bc9-1a40-482e-bec6-f785dec32a1a" (UID: "2d832bc9-1a40-482e-bec6-f785dec32a1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.386530 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d832bc9-1a40-482e-bec6-f785dec32a1a" (UID: "2d832bc9-1a40-482e-bec6-f785dec32a1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.417618 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2d832bc9-1a40-482e-bec6-f785dec32a1a" (UID: "2d832bc9-1a40-482e-bec6-f785dec32a1a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.442572 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.442602 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.442612 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.442622 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d832bc9-1a40-482e-bec6-f785dec32a1a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.442632 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs7vq\" (UniqueName: \"kubernetes.io/projected/2d832bc9-1a40-482e-bec6-f785dec32a1a-kube-api-access-xs7vq\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.974589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2d832bc9-1a40-482e-bec6-f785dec32a1a","Type":"ContainerDied","Data":"716ab24aad910266fe49eebba03a60f848290f3b834f4ac55620d46ec2ed2c69"} Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.974687 4764 scope.go:117] "RemoveContainer" containerID="88b1426559b98be5ce9b8e93e655cdafb563ecc05bb0427370baf013853e5292" Oct 01 16:22:12 crc kubenswrapper[4764]: I1001 16:22:12.974704 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.055694 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.065523 4764 scope.go:117] "RemoveContainer" containerID="cca84d56b5ca81e9471bed13e82781c34d4a956d21e8e27a8804035180c75e2e" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.070834 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.094438 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:13 crc kubenswrapper[4764]: E1001 16:22:13.095382 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfc22be-0529-4b69-b782-c21bdd4fdaa6" containerName="nova-manage" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.095616 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfc22be-0529-4b69-b782-c21bdd4fdaa6" containerName="nova-manage" Oct 01 16:22:13 crc kubenswrapper[4764]: E1001 16:22:13.095898 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="dnsmasq-dns" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.096024 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="dnsmasq-dns" Oct 01 16:22:13 crc kubenswrapper[4764]: E1001 16:22:13.096570 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="init" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.096902 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="init" Oct 01 16:22:13 crc kubenswrapper[4764]: E1001 16:22:13.097113 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" 
containerName="nova-api-api" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.097278 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-api" Oct 01 16:22:13 crc kubenswrapper[4764]: E1001 16:22:13.097626 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-log" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.097745 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-log" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.098177 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-log" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.098369 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="75884291-058a-479e-9c6f-9880d64900fe" containerName="dnsmasq-dns" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.098519 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" containerName="nova-api-api" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.098650 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfc22be-0529-4b69-b782-c21bdd4fdaa6" containerName="nova-manage" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.103728 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.107591 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.107788 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.107904 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.117337 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.156492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-config-data\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.156833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.156933 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-internal-tls-certs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.157079 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-public-tls-certs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.157210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/445abcb4-96ed-403c-bf18-0c1bc5440182-logs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.157300 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7cnl\" (UniqueName: \"kubernetes.io/projected/445abcb4-96ed-403c-bf18-0c1bc5440182-kube-api-access-f7cnl\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.259458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-public-tls-certs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.259613 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/445abcb4-96ed-403c-bf18-0c1bc5440182-logs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.259681 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7cnl\" (UniqueName: \"kubernetes.io/projected/445abcb4-96ed-403c-bf18-0c1bc5440182-kube-api-access-f7cnl\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 
crc kubenswrapper[4764]: I1001 16:22:13.259735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-config-data\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.259852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.259912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-internal-tls-certs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.260345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/445abcb4-96ed-403c-bf18-0c1bc5440182-logs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.263596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-public-tls-certs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.263966 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-config-data\") pod \"nova-api-0\" (UID: 
\"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.265372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.266163 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/445abcb4-96ed-403c-bf18-0c1bc5440182-internal-tls-certs\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.277451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7cnl\" (UniqueName: \"kubernetes.io/projected/445abcb4-96ed-403c-bf18-0c1bc5440182-kube-api-access-f7cnl\") pod \"nova-api-0\" (UID: \"445abcb4-96ed-403c-bf18-0c1bc5440182\") " pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.432283 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.747084 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d832bc9-1a40-482e-bec6-f785dec32a1a" path="/var/lib/kubelet/pods/2d832bc9-1a40-482e-bec6-f785dec32a1a/volumes" Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.934502 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 16:22:13 crc kubenswrapper[4764]: I1001 16:22:13.991322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"445abcb4-96ed-403c-bf18-0c1bc5440182","Type":"ContainerStarted","Data":"c713f528cbe197c2dadab16dae498de49b9c3aebd2c05d9d0e874e39b3a5e357"} Oct 01 16:22:14 crc kubenswrapper[4764]: I1001 16:22:14.283635 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": read tcp 10.217.0.2:37124->10.217.0.179:8775: read: connection reset by peer" Oct 01 16:22:14 crc kubenswrapper[4764]: I1001 16:22:14.283683 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.179:8775/\": read tcp 10.217.0.2:37122->10.217.0.179:8775: read: connection reset by peer" Oct 01 16:22:14 crc kubenswrapper[4764]: I1001 16:22:14.962026 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.005717 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d905f1ae-9a08-4050-8661-c7069a8d8b83-logs\") pod \"d905f1ae-9a08-4050-8661-c7069a8d8b83\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.005856 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpfdm\" (UniqueName: \"kubernetes.io/projected/d905f1ae-9a08-4050-8661-c7069a8d8b83-kube-api-access-jpfdm\") pod \"d905f1ae-9a08-4050-8661-c7069a8d8b83\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.005896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-config-data\") pod \"d905f1ae-9a08-4050-8661-c7069a8d8b83\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.005948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-nova-metadata-tls-certs\") pod \"d905f1ae-9a08-4050-8661-c7069a8d8b83\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.005995 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-combined-ca-bundle\") pod \"d905f1ae-9a08-4050-8661-c7069a8d8b83\" (UID: \"d905f1ae-9a08-4050-8661-c7069a8d8b83\") " Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.006323 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d905f1ae-9a08-4050-8661-c7069a8d8b83-logs" (OuterVolumeSpecName: "logs") pod "d905f1ae-9a08-4050-8661-c7069a8d8b83" (UID: "d905f1ae-9a08-4050-8661-c7069a8d8b83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.006682 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d905f1ae-9a08-4050-8661-c7069a8d8b83-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.014119 4764 generic.go:334] "Generic (PLEG): container finished" podID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerID="71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f" exitCode=0 Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.014537 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.016107 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d905f1ae-9a08-4050-8661-c7069a8d8b83","Type":"ContainerDied","Data":"71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f"} Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.016158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d905f1ae-9a08-4050-8661-c7069a8d8b83","Type":"ContainerDied","Data":"4bed8cf3a22a2b91d9f383ed0e21181d84cfbfa2d5a4ca61519d72c1aeaa10a1"} Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.016188 4764 scope.go:117] "RemoveContainer" containerID="71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.017424 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d905f1ae-9a08-4050-8661-c7069a8d8b83-kube-api-access-jpfdm" (OuterVolumeSpecName: 
"kube-api-access-jpfdm") pod "d905f1ae-9a08-4050-8661-c7069a8d8b83" (UID: "d905f1ae-9a08-4050-8661-c7069a8d8b83"). InnerVolumeSpecName "kube-api-access-jpfdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.021629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"445abcb4-96ed-403c-bf18-0c1bc5440182","Type":"ContainerStarted","Data":"684e637e720a6f71ff62ed7755612d16a4f7fb93265b060e7f6e642adf2d37a9"} Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.021690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"445abcb4-96ed-403c-bf18-0c1bc5440182","Type":"ContainerStarted","Data":"72f5b844e4428be3dd7388e7d01c737b355d770b075c8e3618fcf4179ac7db56"} Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.055166 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.055084912 podStartE2EDuration="2.055084912s" podCreationTimestamp="2025-10-01 16:22:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:22:15.051471554 +0000 UTC m=+1198.051118399" watchObservedRunningTime="2025-10-01 16:22:15.055084912 +0000 UTC m=+1198.054731747" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.062432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d905f1ae-9a08-4050-8661-c7069a8d8b83" (UID: "d905f1ae-9a08-4050-8661-c7069a8d8b83"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.075104 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-config-data" (OuterVolumeSpecName: "config-data") pod "d905f1ae-9a08-4050-8661-c7069a8d8b83" (UID: "d905f1ae-9a08-4050-8661-c7069a8d8b83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.110131 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpfdm\" (UniqueName: \"kubernetes.io/projected/d905f1ae-9a08-4050-8661-c7069a8d8b83-kube-api-access-jpfdm\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.110160 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.110170 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.136262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d905f1ae-9a08-4050-8661-c7069a8d8b83" (UID: "d905f1ae-9a08-4050-8661-c7069a8d8b83"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.137594 4764 scope.go:117] "RemoveContainer" containerID="e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.156992 4764 scope.go:117] "RemoveContainer" containerID="71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f" Oct 01 16:22:15 crc kubenswrapper[4764]: E1001 16:22:15.158336 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f\": container with ID starting with 71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f not found: ID does not exist" containerID="71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.158436 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f"} err="failed to get container status \"71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f\": rpc error: code = NotFound desc = could not find container \"71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f\": container with ID starting with 71e53d85bb1c680d27d2413f311b751f562559fafa95232122abdf85cfbf8c8f not found: ID does not exist" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.158525 4764 scope.go:117] "RemoveContainer" containerID="e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4" Oct 01 16:22:15 crc kubenswrapper[4764]: E1001 16:22:15.158825 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4\": container with ID starting with 
e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4 not found: ID does not exist" containerID="e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.158845 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4"} err="failed to get container status \"e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4\": rpc error: code = NotFound desc = could not find container \"e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4\": container with ID starting with e436bd0705817adb68f3bd582b5ab2cc47f55323354a127aa08de20bd27d32b4 not found: ID does not exist" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.211235 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d905f1ae-9a08-4050-8661-c7069a8d8b83-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.352659 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.367160 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.396021 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:22:15 crc kubenswrapper[4764]: E1001 16:22:15.396493 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-metadata" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.396514 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-metadata" Oct 01 16:22:15 crc kubenswrapper[4764]: E1001 16:22:15.396529 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-log" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.396537 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-log" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.396763 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-log" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.396781 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" containerName="nova-metadata-metadata" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.397697 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.403184 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.403252 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.413911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.413966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faef6b6-44c8-4251-981a-ca6f0eddeda1-logs\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 
01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.413987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtzf9\" (UniqueName: \"kubernetes.io/projected/7faef6b6-44c8-4251-981a-ca6f0eddeda1-kube-api-access-mtzf9\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.414029 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-config-data\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.414086 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.414094 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.516009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.516271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faef6b6-44c8-4251-981a-ca6f0eddeda1-logs\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " 
pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.516370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtzf9\" (UniqueName: \"kubernetes.io/projected/7faef6b6-44c8-4251-981a-ca6f0eddeda1-kube-api-access-mtzf9\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.516490 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-config-data\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.516638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.517737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7faef6b6-44c8-4251-981a-ca6f0eddeda1-logs\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.520747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.521955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.527155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7faef6b6-44c8-4251-981a-ca6f0eddeda1-config-data\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.538035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtzf9\" (UniqueName: \"kubernetes.io/projected/7faef6b6-44c8-4251-981a-ca6f0eddeda1-kube-api-access-mtzf9\") pod \"nova-metadata-0\" (UID: \"7faef6b6-44c8-4251-981a-ca6f0eddeda1\") " pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.716897 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 16:22:15 crc kubenswrapper[4764]: I1001 16:22:15.740028 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d905f1ae-9a08-4050-8661-c7069a8d8b83" path="/var/lib/kubelet/pods/d905f1ae-9a08-4050-8661-c7069a8d8b83/volumes" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.034797 4764 generic.go:334] "Generic (PLEG): container finished" podID="60ca659a-c997-4901-abd9-0200e6f16aea" containerID="5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e" exitCode=0 Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.034899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"60ca659a-c997-4901-abd9-0200e6f16aea","Type":"ContainerDied","Data":"5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e"} Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.275010 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 16:22:16 crc kubenswrapper[4764]: W1001 16:22:16.286219 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7faef6b6_44c8_4251_981a_ca6f0eddeda1.slice/crio-ede74f40ed83fcd3c3297a3eb2cd9986b3de7c5cc00a000ff33bc3f874335b04 WatchSource:0}: Error finding container ede74f40ed83fcd3c3297a3eb2cd9986b3de7c5cc00a000ff33bc3f874335b04: Status 404 returned error can't find the container with id ede74f40ed83fcd3c3297a3eb2cd9986b3de7c5cc00a000ff33bc3f874335b04 Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.632188 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.743881 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-config-data\") pod \"60ca659a-c997-4901-abd9-0200e6f16aea\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.743933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhj2z\" (UniqueName: \"kubernetes.io/projected/60ca659a-c997-4901-abd9-0200e6f16aea-kube-api-access-rhj2z\") pod \"60ca659a-c997-4901-abd9-0200e6f16aea\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.744136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-combined-ca-bundle\") pod \"60ca659a-c997-4901-abd9-0200e6f16aea\" (UID: \"60ca659a-c997-4901-abd9-0200e6f16aea\") " Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.760505 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ca659a-c997-4901-abd9-0200e6f16aea-kube-api-access-rhj2z" (OuterVolumeSpecName: "kube-api-access-rhj2z") pod "60ca659a-c997-4901-abd9-0200e6f16aea" (UID: "60ca659a-c997-4901-abd9-0200e6f16aea"). InnerVolumeSpecName "kube-api-access-rhj2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.776929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60ca659a-c997-4901-abd9-0200e6f16aea" (UID: "60ca659a-c997-4901-abd9-0200e6f16aea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.796393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-config-data" (OuterVolumeSpecName: "config-data") pod "60ca659a-c997-4901-abd9-0200e6f16aea" (UID: "60ca659a-c997-4901-abd9-0200e6f16aea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.846370 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.846758 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60ca659a-c997-4901-abd9-0200e6f16aea-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:16 crc kubenswrapper[4764]: I1001 16:22:16.846771 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhj2z\" (UniqueName: \"kubernetes.io/projected/60ca659a-c997-4901-abd9-0200e6f16aea-kube-api-access-rhj2z\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.048367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faef6b6-44c8-4251-981a-ca6f0eddeda1","Type":"ContainerStarted","Data":"922a8fd1964086a401a1727adaf664b381b39fd3abd7e411c2b400926a114521"} Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.048433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faef6b6-44c8-4251-981a-ca6f0eddeda1","Type":"ContainerStarted","Data":"ede74f40ed83fcd3c3297a3eb2cd9986b3de7c5cc00a000ff33bc3f874335b04"} Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.051340 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"60ca659a-c997-4901-abd9-0200e6f16aea","Type":"ContainerDied","Data":"6c0aa38303daeaba412b20e6c0aa3ffdbb94c7828eb360072baf02eaaebfc106"} Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.051396 4764 scope.go:117] "RemoveContainer" containerID="5ae2bc29b554eff839b17a07ba1cc6a024038d7d8bf337f3c17debe4d17a2a8e" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.051416 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.089623 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.121813 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.138080 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:22:17 crc kubenswrapper[4764]: E1001 16:22:17.142281 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ca659a-c997-4901-abd9-0200e6f16aea" containerName="nova-scheduler-scheduler" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.142331 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ca659a-c997-4901-abd9-0200e6f16aea" containerName="nova-scheduler-scheduler" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.142707 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ca659a-c997-4901-abd9-0200e6f16aea" containerName="nova-scheduler-scheduler" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.143772 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.145792 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.153128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e705f16-7e06-46aa-a290-f42760df1c2c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.153185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkd6\" (UniqueName: \"kubernetes.io/projected/6e705f16-7e06-46aa-a290-f42760df1c2c-kube-api-access-5zkd6\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.153320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e705f16-7e06-46aa-a290-f42760df1c2c-config-data\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.154918 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.255126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e705f16-7e06-46aa-a290-f42760df1c2c-config-data\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.255499 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e705f16-7e06-46aa-a290-f42760df1c2c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.255675 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkd6\" (UniqueName: \"kubernetes.io/projected/6e705f16-7e06-46aa-a290-f42760df1c2c-kube-api-access-5zkd6\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.262823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e705f16-7e06-46aa-a290-f42760df1c2c-config-data\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.264745 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e705f16-7e06-46aa-a290-f42760df1c2c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.285696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkd6\" (UniqueName: \"kubernetes.io/projected/6e705f16-7e06-46aa-a290-f42760df1c2c-kube-api-access-5zkd6\") pod \"nova-scheduler-0\" (UID: \"6e705f16-7e06-46aa-a290-f42760df1c2c\") " pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.461484 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.735724 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ca659a-c997-4901-abd9-0200e6f16aea" path="/var/lib/kubelet/pods/60ca659a-c997-4901-abd9-0200e6f16aea/volumes" Oct 01 16:22:17 crc kubenswrapper[4764]: I1001 16:22:17.941383 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 16:22:17 crc kubenswrapper[4764]: W1001 16:22:17.958913 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e705f16_7e06_46aa_a290_f42760df1c2c.slice/crio-c6c7c6b9f268882283af92485f248bf5a8e8653de8561c8c6011965936dd9b37 WatchSource:0}: Error finding container c6c7c6b9f268882283af92485f248bf5a8e8653de8561c8c6011965936dd9b37: Status 404 returned error can't find the container with id c6c7c6b9f268882283af92485f248bf5a8e8653de8561c8c6011965936dd9b37 Oct 01 16:22:18 crc kubenswrapper[4764]: I1001 16:22:18.068613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6e705f16-7e06-46aa-a290-f42760df1c2c","Type":"ContainerStarted","Data":"c6c7c6b9f268882283af92485f248bf5a8e8653de8561c8c6011965936dd9b37"} Oct 01 16:22:18 crc kubenswrapper[4764]: I1001 16:22:18.073105 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7faef6b6-44c8-4251-981a-ca6f0eddeda1","Type":"ContainerStarted","Data":"e9c8d99c9e8a2b8cb673f4cfcedc9ceaafe06c9aee02ec4baaf9f85e8e614cd1"} Oct 01 16:22:18 crc kubenswrapper[4764]: I1001 16:22:18.110989 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.110958249 podStartE2EDuration="3.110958249s" podCreationTimestamp="2025-10-01 16:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 16:22:18.094396471 +0000 UTC m=+1201.094043316" watchObservedRunningTime="2025-10-01 16:22:18.110958249 +0000 UTC m=+1201.110605114" Oct 01 16:22:19 crc kubenswrapper[4764]: I1001 16:22:19.101199 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6e705f16-7e06-46aa-a290-f42760df1c2c","Type":"ContainerStarted","Data":"f79036c4a4b6a4e0de3cdae350bb7b192c85f6cf895e15055f3ac3233dd3de42"} Oct 01 16:22:19 crc kubenswrapper[4764]: I1001 16:22:19.131426 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.131401757 podStartE2EDuration="2.131401757s" podCreationTimestamp="2025-10-01 16:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:22:19.121193205 +0000 UTC m=+1202.120840080" watchObservedRunningTime="2025-10-01 16:22:19.131401757 +0000 UTC m=+1202.131048632" Oct 01 16:22:20 crc kubenswrapper[4764]: I1001 16:22:20.717343 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 16:22:20 crc kubenswrapper[4764]: I1001 16:22:20.719142 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 16:22:21 crc kubenswrapper[4764]: I1001 16:22:21.913814 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:22:21 crc kubenswrapper[4764]: I1001 16:22:21.913888 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:22:22 crc kubenswrapper[4764]: I1001 16:22:22.462635 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 16:22:23 crc kubenswrapper[4764]: I1001 16:22:23.433170 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 16:22:23 crc kubenswrapper[4764]: I1001 16:22:23.433231 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 16:22:24 crc kubenswrapper[4764]: I1001 16:22:24.445417 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="445abcb4-96ed-403c-bf18-0c1bc5440182" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 16:22:24 crc kubenswrapper[4764]: I1001 16:22:24.445490 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="445abcb4-96ed-403c-bf18-0c1bc5440182" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.187:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 16:22:25 crc kubenswrapper[4764]: I1001 16:22:25.717872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 16:22:25 crc kubenswrapper[4764]: I1001 16:22:25.718198 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 16:22:26 crc kubenswrapper[4764]: I1001 16:22:26.730258 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7faef6b6-44c8-4251-981a-ca6f0eddeda1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 16:22:26 crc kubenswrapper[4764]: I1001 16:22:26.730307 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7faef6b6-44c8-4251-981a-ca6f0eddeda1" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 16:22:27 crc kubenswrapper[4764]: I1001 16:22:27.462697 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 16:22:27 crc kubenswrapper[4764]: I1001 16:22:27.511241 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 16:22:28 crc kubenswrapper[4764]: I1001 16:22:28.233326 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 16:22:32 crc kubenswrapper[4764]: I1001 16:22:32.576771 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 16:22:33 crc kubenswrapper[4764]: I1001 16:22:33.440193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 16:22:33 crc kubenswrapper[4764]: I1001 16:22:33.441808 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 16:22:33 crc kubenswrapper[4764]: I1001 16:22:33.442282 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 16:22:33 crc kubenswrapper[4764]: I1001 16:22:33.448380 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 16:22:34 crc kubenswrapper[4764]: I1001 16:22:34.269952 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 16:22:34 crc kubenswrapper[4764]: I1001 16:22:34.278726 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 16:22:35 crc kubenswrapper[4764]: I1001 16:22:35.738130 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 16:22:35 crc kubenswrapper[4764]: I1001 16:22:35.738234 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 16:22:35 crc kubenswrapper[4764]: I1001 16:22:35.752816 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 16:22:35 crc kubenswrapper[4764]: I1001 16:22:35.754799 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 16:22:43 crc kubenswrapper[4764]: I1001 16:22:43.914973 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:22:45 crc kubenswrapper[4764]: I1001 16:22:45.302464 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:22:47 crc kubenswrapper[4764]: I1001 16:22:47.820462 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="rabbitmq" containerID="cri-o://908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d" gracePeriod=604797 Oct 01 16:22:49 crc kubenswrapper[4764]: I1001 16:22:49.309649 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 01 16:22:49 crc kubenswrapper[4764]: I1001 16:22:49.653114 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="479f4015-9972-4350-bac3-6292b0c962ec" containerName="rabbitmq" 
containerID="cri-o://a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f" gracePeriod=604796 Oct 01 16:22:51 crc kubenswrapper[4764]: I1001 16:22:51.913995 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:22:51 crc kubenswrapper[4764]: I1001 16:22:51.914634 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.434681 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472642 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-config-data\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472725 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-confd\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472753 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-plugins-conf\") pod 
\"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bf385ea-f77a-4773-9c0c-e57f611707db-erlang-cookie-secret\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-erlang-cookie\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkfzz\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-kube-api-access-gkfzz\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-plugins\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.472991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-server-conf\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.473029 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-tls\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.473246 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bf385ea-f77a-4773-9c0c-e57f611707db-pod-info\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.473277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8bf385ea-f77a-4773-9c0c-e57f611707db\" (UID: \"8bf385ea-f77a-4773-9c0c-e57f611707db\") " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.474039 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.474776 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.474938 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.481765 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.483462 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.490068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8bf385ea-f77a-4773-9c0c-e57f611707db-pod-info" (OuterVolumeSpecName: "pod-info") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.496658 4764 generic.go:334] "Generic (PLEG): container finished" podID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerID="908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d" exitCode=0 Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.496752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bf385ea-f77a-4773-9c0c-e57f611707db","Type":"ContainerDied","Data":"908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d"} Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.496820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"8bf385ea-f77a-4773-9c0c-e57f611707db","Type":"ContainerDied","Data":"0a6dfc93a264ae5ec03e23412973b42d371d89d8f78983afdc3e5e56c0162193"} Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.496846 4764 scope.go:117] "RemoveContainer" containerID="908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.497227 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.497946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bf385ea-f77a-4773-9c0c-e57f611707db-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.497955 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-kube-api-access-gkfzz" (OuterVolumeSpecName: "kube-api-access-gkfzz") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "kube-api-access-gkfzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.538278 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-config-data" (OuterVolumeSpecName: "config-data") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577401 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8bf385ea-f77a-4773-9c0c-e57f611707db-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577451 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577464 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577475 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 
16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577489 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8bf385ea-f77a-4773-9c0c-e57f611707db-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577502 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577514 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkfzz\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-kube-api-access-gkfzz\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577526 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.577536 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.596312 4764 scope.go:117] "RemoveContainer" containerID="e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.596521 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-server-conf" (OuterVolumeSpecName: "server-conf") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.599908 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.619447 4764 scope.go:117] "RemoveContainer" containerID="908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d" Oct 01 16:22:54 crc kubenswrapper[4764]: E1001 16:22:54.619826 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d\": container with ID starting with 908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d not found: ID does not exist" containerID="908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.619862 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d"} err="failed to get container status \"908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d\": rpc error: code = NotFound desc = could not find container \"908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d\": container with ID starting with 908bccfc60adc71acc7a3a4522bf29f9aab31c7e5ab0e8b886fb96ba8a0efd8d not found: ID does not exist" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.619887 4764 scope.go:117] "RemoveContainer" containerID="e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5" Oct 01 16:22:54 crc kubenswrapper[4764]: E1001 16:22:54.620213 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5\": container with ID starting with 
e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5 not found: ID does not exist" containerID="e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.620241 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5"} err="failed to get container status \"e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5\": rpc error: code = NotFound desc = could not find container \"e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5\": container with ID starting with e646fa409a32957d53b50952baee94a73d06574d1a11a66aef2050c5c9358aa5 not found: ID does not exist" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.633168 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8bf385ea-f77a-4773-9c0c-e57f611707db" (UID: "8bf385ea-f77a-4773-9c0c-e57f611707db"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.679288 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8bf385ea-f77a-4773-9c0c-e57f611707db-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.679325 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.679333 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8bf385ea-f77a-4773-9c0c-e57f611707db-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.873999 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.882659 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.897791 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:22:54 crc kubenswrapper[4764]: E1001 16:22:54.898241 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="setup-container" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.898261 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="setup-container" Oct 01 16:22:54 crc kubenswrapper[4764]: E1001 16:22:54.898284 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="rabbitmq" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.898291 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="rabbitmq" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.898509 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" containerName="rabbitmq" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.899610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.905653 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.905761 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.905663 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.905895 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.905999 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5v9t9" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.906107 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.906163 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.913773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrb8x\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-kube-api-access-zrb8x\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984521 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17487462-b952-4428-a875-61732b895017-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17487462-b952-4428-a875-61732b895017-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984594 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17487462-b952-4428-a875-61732b895017-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984645 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-config-data\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17487462-b952-4428-a875-61732b895017-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:54 crc kubenswrapper[4764]: I1001 16:22:54.984803 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.091937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-config-data\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.091988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17487462-b952-4428-a875-61732b895017-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092034 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092152 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrb8x\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-kube-api-access-zrb8x\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17487462-b952-4428-a875-61732b895017-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17487462-b952-4428-a875-61732b895017-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17487462-b952-4428-a875-61732b895017-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.092304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.093024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17487462-b952-4428-a875-61732b895017-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.093289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17487462-b952-4428-a875-61732b895017-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.093349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.093492 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.093939 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " 
pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.093950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17487462-b952-4428-a875-61732b895017-config-data\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.097344 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.101796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17487462-b952-4428-a875-61732b895017-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.103647 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17487462-b952-4428-a875-61732b895017-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.104208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.117811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrb8x\" (UniqueName: 
\"kubernetes.io/projected/17487462-b952-4428-a875-61732b895017-kube-api-access-zrb8x\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.131512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"17487462-b952-4428-a875-61732b895017\") " pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.233773 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 16:22:55 crc kubenswrapper[4764]: W1001 16:22:55.733263 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17487462_b952_4428_a875_61732b895017.slice/crio-ee7dead2e72e817ec322dfce8f506c70c27df5d691c1fe2791bd4734b6f38142 WatchSource:0}: Error finding container ee7dead2e72e817ec322dfce8f506c70c27df5d691c1fe2791bd4734b6f38142: Status 404 returned error can't find the container with id ee7dead2e72e817ec322dfce8f506c70c27df5d691c1fe2791bd4734b6f38142 Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.738328 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf385ea-f77a-4773-9c0c-e57f611707db" path="/var/lib/kubelet/pods/8bf385ea-f77a-4773-9c0c-e57f611707db/volumes" Oct 01 16:22:55 crc kubenswrapper[4764]: I1001 16:22:55.739058 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.240055 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-erlang-cookie\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312849 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-plugins-conf\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-confd\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312884 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-plugins\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-config-data\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.312986 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/479f4015-9972-4350-bac3-6292b0c962ec-pod-info\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.313035 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-tls\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.313102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/479f4015-9972-4350-bac3-6292b0c962ec-erlang-cookie-secret\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.313139 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-server-conf\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.313212 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq9ph\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-kube-api-access-fq9ph\") pod \"479f4015-9972-4350-bac3-6292b0c962ec\" (UID: \"479f4015-9972-4350-bac3-6292b0c962ec\") " Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 
16:22:56.313844 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.317852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-kube-api-access-fq9ph" (OuterVolumeSpecName: "kube-api-access-fq9ph") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "kube-api-access-fq9ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.317937 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/479f4015-9972-4350-bac3-6292b0c962ec-pod-info" (OuterVolumeSpecName: "pod-info") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.318316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.318853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.318872 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.334901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.362677 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479f4015-9972-4350-bac3-6292b0c962ec-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.365963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-config-data" (OuterVolumeSpecName: "config-data") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.384319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-server-conf" (OuterVolumeSpecName: "server-conf") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417157 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/479f4015-9972-4350-bac3-6292b0c962ec-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417196 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417206 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq9ph\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-kube-api-access-fq9ph\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417230 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 01 
16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417241 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417251 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417259 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417266 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/479f4015-9972-4350-bac3-6292b0c962ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417274 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/479f4015-9972-4350-bac3-6292b0c962ec-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.417282 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.439846 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.466322 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "479f4015-9972-4350-bac3-6292b0c962ec" (UID: "479f4015-9972-4350-bac3-6292b0c962ec"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.518842 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.518872 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/479f4015-9972-4350-bac3-6292b0c962ec-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.527852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17487462-b952-4428-a875-61732b895017","Type":"ContainerStarted","Data":"ee7dead2e72e817ec322dfce8f506c70c27df5d691c1fe2791bd4734b6f38142"} Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.530037 4764 generic.go:334] "Generic (PLEG): container finished" podID="479f4015-9972-4350-bac3-6292b0c962ec" containerID="a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f" exitCode=0 Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.530097 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"479f4015-9972-4350-bac3-6292b0c962ec","Type":"ContainerDied","Data":"a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f"} Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.530146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"479f4015-9972-4350-bac3-6292b0c962ec","Type":"ContainerDied","Data":"2162ff10b4ebcba6b6f0773dc5dd0ec1a6384c8fe386119983bc7779045a2dcd"} Oct 01 
16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.530163 4764 scope.go:117] "RemoveContainer" containerID="a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.530282 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.575611 4764 scope.go:117] "RemoveContainer" containerID="52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.586207 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.605253 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.618742 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:22:56 crc kubenswrapper[4764]: E1001 16:22:56.619141 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479f4015-9972-4350-bac3-6292b0c962ec" containerName="setup-container" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.619153 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="479f4015-9972-4350-bac3-6292b0c962ec" containerName="setup-container" Oct 01 16:22:56 crc kubenswrapper[4764]: E1001 16:22:56.619175 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479f4015-9972-4350-bac3-6292b0c962ec" containerName="rabbitmq" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.619180 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="479f4015-9972-4350-bac3-6292b0c962ec" containerName="rabbitmq" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.619334 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="479f4015-9972-4350-bac3-6292b0c962ec" containerName="rabbitmq" Oct 01 
16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.620472 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.622417 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.622883 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xv7cg" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.622891 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.623013 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.624617 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.624879 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.628895 4764 scope.go:117] "RemoveContainer" containerID="a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.629123 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.635630 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:22:56 crc kubenswrapper[4764]: E1001 16:22:56.640951 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f\": container 
with ID starting with a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f not found: ID does not exist" containerID="a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.641001 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f"} err="failed to get container status \"a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f\": rpc error: code = NotFound desc = could not find container \"a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f\": container with ID starting with a093782315604cb982af0dc1d2bcb839738cf84bf6b0a72c1f2a8949785e648f not found: ID does not exist" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.641031 4764 scope.go:117] "RemoveContainer" containerID="52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e" Oct 01 16:22:56 crc kubenswrapper[4764]: E1001 16:22:56.642634 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e\": container with ID starting with 52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e not found: ID does not exist" containerID="52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.642667 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e"} err="failed to get container status \"52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e\": rpc error: code = NotFound desc = could not find container \"52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e\": container with ID starting with 52ac9acba897b8e9057364607d74dceba80df0919088af0a8b45b103e07de86e not 
found: ID does not exist" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.721811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.721847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw47r\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-kube-api-access-fw47r\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.721868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6c99317-e5aa-4c87-a45a-34e4d14846e4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.721892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722489 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6c99317-e5aa-4c87-a45a-34e4d14846e4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.722622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw47r\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-kube-api-access-fw47r\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6c99317-e5aa-4c87-a45a-34e4d14846e4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: 
I1001 16:22:56.824423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824495 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6c99317-e5aa-4c87-a45a-34e4d14846e4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824563 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824595 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.824834 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.825167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.826208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.826694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.826733 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e6c99317-e5aa-4c87-a45a-34e4d14846e4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.826905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.875394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.875698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.877033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e6c99317-e5aa-4c87-a45a-34e4d14846e4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 
16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.877785 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw47r\" (UniqueName: \"kubernetes.io/projected/e6c99317-e5aa-4c87-a45a-34e4d14846e4-kube-api-access-fw47r\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.883320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e6c99317-e5aa-4c87-a45a-34e4d14846e4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.903626 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"e6c99317-e5aa-4c87-a45a-34e4d14846e4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:56 crc kubenswrapper[4764]: I1001 16:22:56.942998 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:22:57 crc kubenswrapper[4764]: I1001 16:22:57.437189 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 16:22:57 crc kubenswrapper[4764]: I1001 16:22:57.539568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e6c99317-e5aa-4c87-a45a-34e4d14846e4","Type":"ContainerStarted","Data":"df457e2a216b792e715589f287d4d73fc5460564ca82427b35f03bd650b0d39e"} Oct 01 16:22:57 crc kubenswrapper[4764]: I1001 16:22:57.541192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17487462-b952-4428-a875-61732b895017","Type":"ContainerStarted","Data":"9f0bf694dc65bd06dbb18c2c5aae8ff5428f096ef978c1bc195102be83784a3b"} Oct 01 16:22:57 crc kubenswrapper[4764]: I1001 16:22:57.768676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479f4015-9972-4350-bac3-6292b0c962ec" path="/var/lib/kubelet/pods/479f4015-9972-4350-bac3-6292b0c962ec/volumes" Oct 01 16:22:59 crc kubenswrapper[4764]: I1001 16:22:59.566586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e6c99317-e5aa-4c87-a45a-34e4d14846e4","Type":"ContainerStarted","Data":"ac5bfe6e20abbb15dee31b81bf0b274566ec0a72792433ef17e85cd583e19cca"} Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.635773 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-z25nx"] Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.637479 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.641827 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.679860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-z25nx"] Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.807932 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.808023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-config\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.808335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.808443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.808558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkgr\" (UniqueName: \"kubernetes.io/projected/18119c25-297e-4101-855c-1d79282923e7-kube-api-access-8kkgr\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.808603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.909969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.910149 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.910219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkgr\" (UniqueName: \"kubernetes.io/projected/18119c25-297e-4101-855c-1d79282923e7-kube-api-access-8kkgr\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: 
\"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.910261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.910283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.910336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-config\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.911358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.911367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 
16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.911538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.911846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.912253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-config\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.932354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkgr\" (UniqueName: \"kubernetes.io/projected/18119c25-297e-4101-855c-1d79282923e7-kube-api-access-8kkgr\") pod \"dnsmasq-dns-6447ccbd8f-z25nx\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:00 crc kubenswrapper[4764]: I1001 16:23:00.985120 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:01 crc kubenswrapper[4764]: I1001 16:23:01.297601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-z25nx"] Oct 01 16:23:01 crc kubenswrapper[4764]: W1001 16:23:01.304553 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18119c25_297e_4101_855c_1d79282923e7.slice/crio-36e9730fcd4cc838ca2cdf97e6565bdaba0ae24c2672df19b69301232d4e3f13 WatchSource:0}: Error finding container 36e9730fcd4cc838ca2cdf97e6565bdaba0ae24c2672df19b69301232d4e3f13: Status 404 returned error can't find the container with id 36e9730fcd4cc838ca2cdf97e6565bdaba0ae24c2672df19b69301232d4e3f13 Oct 01 16:23:01 crc kubenswrapper[4764]: I1001 16:23:01.588569 4764 generic.go:334] "Generic (PLEG): container finished" podID="18119c25-297e-4101-855c-1d79282923e7" containerID="0c2cafae36d06d5b16092f588a287b75fbaee2067cdecbe243104d580266304d" exitCode=0 Oct 01 16:23:01 crc kubenswrapper[4764]: I1001 16:23:01.588649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" event={"ID":"18119c25-297e-4101-855c-1d79282923e7","Type":"ContainerDied","Data":"0c2cafae36d06d5b16092f588a287b75fbaee2067cdecbe243104d580266304d"} Oct 01 16:23:01 crc kubenswrapper[4764]: I1001 16:23:01.588689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" event={"ID":"18119c25-297e-4101-855c-1d79282923e7","Type":"ContainerStarted","Data":"36e9730fcd4cc838ca2cdf97e6565bdaba0ae24c2672df19b69301232d4e3f13"} Oct 01 16:23:02 crc kubenswrapper[4764]: I1001 16:23:02.600243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" event={"ID":"18119c25-297e-4101-855c-1d79282923e7","Type":"ContainerStarted","Data":"d395865b3fb206a1da562bac09ed01816d900ab627ea16c5ae5936a7679ba16a"} Oct 01 16:23:02 crc 
kubenswrapper[4764]: I1001 16:23:02.600593 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:02 crc kubenswrapper[4764]: I1001 16:23:02.627845 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" podStartSLOduration=2.6278265149999998 podStartE2EDuration="2.627826515s" podCreationTimestamp="2025-10-01 16:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:23:02.622869944 +0000 UTC m=+1245.622516799" watchObservedRunningTime="2025-10-01 16:23:02.627826515 +0000 UTC m=+1245.627473360" Oct 01 16:23:10 crc kubenswrapper[4764]: I1001 16:23:10.987220 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.075297 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mmg6q"] Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.075899 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerName="dnsmasq-dns" containerID="cri-o://cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd" gracePeriod=10 Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.266175 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f7bn8"] Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.267655 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.280910 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f7bn8"] Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.431503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.431836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.431886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.431911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-config\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.432212 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.432690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m957\" (UniqueName: \"kubernetes.io/projected/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-kube-api-access-8m957\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.535104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m957\" (UniqueName: \"kubernetes.io/projected/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-kube-api-access-8m957\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.535234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.535263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.535317 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.535344 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-config\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.535405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.536202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.536216 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.536489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.536817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.537111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-config\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.558117 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m957\" (UniqueName: \"kubernetes.io/projected/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-kube-api-access-8m957\") pod \"dnsmasq-dns-864d5fc68c-f7bn8\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.591865 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.653920 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.722405 4764 generic.go:334] "Generic (PLEG): container finished" podID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerID="cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd" exitCode=0 Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.722526 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.739424 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" event={"ID":"4a2a8617-a485-4228-94ff-874d395bc9a8","Type":"ContainerDied","Data":"cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd"} Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.739476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-mmg6q" event={"ID":"4a2a8617-a485-4228-94ff-874d395bc9a8","Type":"ContainerDied","Data":"d7d24143e3b9015fa62edb50a7f6c610f96fab99879b32fb52769f07653b5696"} Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.739502 4764 scope.go:117] "RemoveContainer" containerID="cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.765823 4764 scope.go:117] "RemoveContainer" containerID="0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.790286 4764 scope.go:117] "RemoveContainer" containerID="cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd" Oct 01 16:23:11 crc kubenswrapper[4764]: E1001 16:23:11.792543 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd\": container with ID starting with 
cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd not found: ID does not exist" containerID="cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.792593 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd"} err="failed to get container status \"cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd\": rpc error: code = NotFound desc = could not find container \"cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd\": container with ID starting with cb51c2b6fc6d3054fafd6f37dd9138053b9fbd9ac52acdd735e5d08c6a8565dd not found: ID does not exist" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.792647 4764 scope.go:117] "RemoveContainer" containerID="0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4" Oct 01 16:23:11 crc kubenswrapper[4764]: E1001 16:23:11.793501 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4\": container with ID starting with 0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4 not found: ID does not exist" containerID="0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.793556 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4"} err="failed to get container status \"0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4\": rpc error: code = NotFound desc = could not find container \"0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4\": container with ID starting with 0225d91c2a2841e1744ca00766f8654e8fe4af4b16d139db76228ddaf04a72a4 not found: ID does not 
exist" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.841376 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86r8t\" (UniqueName: \"kubernetes.io/projected/4a2a8617-a485-4228-94ff-874d395bc9a8-kube-api-access-86r8t\") pod \"4a2a8617-a485-4228-94ff-874d395bc9a8\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.841730 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-nb\") pod \"4a2a8617-a485-4228-94ff-874d395bc9a8\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.841949 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-config\") pod \"4a2a8617-a485-4228-94ff-874d395bc9a8\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.842248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-sb\") pod \"4a2a8617-a485-4228-94ff-874d395bc9a8\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.843290 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-dns-svc\") pod \"4a2a8617-a485-4228-94ff-874d395bc9a8\" (UID: \"4a2a8617-a485-4228-94ff-874d395bc9a8\") " Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.847829 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2a8617-a485-4228-94ff-874d395bc9a8-kube-api-access-86r8t" 
(OuterVolumeSpecName: "kube-api-access-86r8t") pod "4a2a8617-a485-4228-94ff-874d395bc9a8" (UID: "4a2a8617-a485-4228-94ff-874d395bc9a8"). InnerVolumeSpecName "kube-api-access-86r8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.891640 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a2a8617-a485-4228-94ff-874d395bc9a8" (UID: "4a2a8617-a485-4228-94ff-874d395bc9a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.897772 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a2a8617-a485-4228-94ff-874d395bc9a8" (UID: "4a2a8617-a485-4228-94ff-874d395bc9a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.905915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-config" (OuterVolumeSpecName: "config") pod "4a2a8617-a485-4228-94ff-874d395bc9a8" (UID: "4a2a8617-a485-4228-94ff-874d395bc9a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.914603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a2a8617-a485-4228-94ff-874d395bc9a8" (UID: "4a2a8617-a485-4228-94ff-874d395bc9a8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.945613 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.945641 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.945651 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86r8t\" (UniqueName: \"kubernetes.io/projected/4a2a8617-a485-4228-94ff-874d395bc9a8-kube-api-access-86r8t\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.945675 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:11 crc kubenswrapper[4764]: I1001 16:23:11.945685 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a2a8617-a485-4228-94ff-874d395bc9a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:12 crc kubenswrapper[4764]: I1001 16:23:12.057818 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mmg6q"] Oct 01 16:23:12 crc kubenswrapper[4764]: I1001 16:23:12.065370 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-mmg6q"] Oct 01 16:23:12 crc kubenswrapper[4764]: I1001 16:23:12.079862 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f7bn8"] Oct 01 16:23:12 crc kubenswrapper[4764]: W1001 16:23:12.084098 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff2f77d9_2e42_4764_b0b5_6a4b13877ed4.slice/crio-fab9eeccb127c2f580a085ae7da46c987a85005632ecd55d25908a3be04e5bac WatchSource:0}: Error finding container fab9eeccb127c2f580a085ae7da46c987a85005632ecd55d25908a3be04e5bac: Status 404 returned error can't find the container with id fab9eeccb127c2f580a085ae7da46c987a85005632ecd55d25908a3be04e5bac Oct 01 16:23:12 crc kubenswrapper[4764]: I1001 16:23:12.735894 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerID="cb5432978d44c0c2e729353a7af8c1642f9976509ecca80bd3dbe2945ec85b3e" exitCode=0 Oct 01 16:23:12 crc kubenswrapper[4764]: I1001 16:23:12.735991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" event={"ID":"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4","Type":"ContainerDied","Data":"cb5432978d44c0c2e729353a7af8c1642f9976509ecca80bd3dbe2945ec85b3e"} Oct 01 16:23:12 crc kubenswrapper[4764]: I1001 16:23:12.736451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" event={"ID":"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4","Type":"ContainerStarted","Data":"fab9eeccb127c2f580a085ae7da46c987a85005632ecd55d25908a3be04e5bac"} Oct 01 16:23:13 crc kubenswrapper[4764]: I1001 16:23:13.742316 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" path="/var/lib/kubelet/pods/4a2a8617-a485-4228-94ff-874d395bc9a8/volumes" Oct 01 16:23:13 crc kubenswrapper[4764]: I1001 16:23:13.758209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" event={"ID":"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4","Type":"ContainerStarted","Data":"f40c13c0ad3da5ee48ba730bc10a564b9bfef1fa53bd6b296f3cad125a7a5854"} Oct 01 16:23:13 crc kubenswrapper[4764]: I1001 16:23:13.758645 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:13 crc kubenswrapper[4764]: I1001 16:23:13.792083 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" podStartSLOduration=2.792036398 podStartE2EDuration="2.792036398s" podCreationTimestamp="2025-10-01 16:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:23:13.786882161 +0000 UTC m=+1256.786529026" watchObservedRunningTime="2025-10-01 16:23:13.792036398 +0000 UTC m=+1256.791683233" Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.594847 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.704878 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-z25nx"] Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.705199 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" podUID="18119c25-297e-4101-855c-1d79282923e7" containerName="dnsmasq-dns" containerID="cri-o://d395865b3fb206a1da562bac09ed01816d900ab627ea16c5ae5936a7679ba16a" gracePeriod=10 Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.843676 4764 generic.go:334] "Generic (PLEG): container finished" podID="18119c25-297e-4101-855c-1d79282923e7" containerID="d395865b3fb206a1da562bac09ed01816d900ab627ea16c5ae5936a7679ba16a" exitCode=0 Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.843723 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" event={"ID":"18119c25-297e-4101-855c-1d79282923e7","Type":"ContainerDied","Data":"d395865b3fb206a1da562bac09ed01816d900ab627ea16c5ae5936a7679ba16a"} Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.914454 4764 patch_prober.go:28] interesting 
pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.914508 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.914547 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.915245 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98c11f42deb2a855802db6e539c07b78ed64042cc307603fe80868f20ffd6d4f"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:23:21 crc kubenswrapper[4764]: I1001 16:23:21.915301 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://98c11f42deb2a855802db6e539c07b78ed64042cc307603fe80868f20ffd6d4f" gracePeriod=600 Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.220776 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.360489 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-sb\") pod \"18119c25-297e-4101-855c-1d79282923e7\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.360531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kkgr\" (UniqueName: \"kubernetes.io/projected/18119c25-297e-4101-855c-1d79282923e7-kube-api-access-8kkgr\") pod \"18119c25-297e-4101-855c-1d79282923e7\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.360560 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-openstack-edpm-ipam\") pod \"18119c25-297e-4101-855c-1d79282923e7\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.360605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-dns-svc\") pod \"18119c25-297e-4101-855c-1d79282923e7\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.360628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-config\") pod \"18119c25-297e-4101-855c-1d79282923e7\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.360662 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-nb\") pod \"18119c25-297e-4101-855c-1d79282923e7\" (UID: \"18119c25-297e-4101-855c-1d79282923e7\") " Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.367711 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18119c25-297e-4101-855c-1d79282923e7-kube-api-access-8kkgr" (OuterVolumeSpecName: "kube-api-access-8kkgr") pod "18119c25-297e-4101-855c-1d79282923e7" (UID: "18119c25-297e-4101-855c-1d79282923e7"). InnerVolumeSpecName "kube-api-access-8kkgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.406850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18119c25-297e-4101-855c-1d79282923e7" (UID: "18119c25-297e-4101-855c-1d79282923e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.412816 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-config" (OuterVolumeSpecName: "config") pod "18119c25-297e-4101-855c-1d79282923e7" (UID: "18119c25-297e-4101-855c-1d79282923e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.413285 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18119c25-297e-4101-855c-1d79282923e7" (UID: "18119c25-297e-4101-855c-1d79282923e7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.420887 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18119c25-297e-4101-855c-1d79282923e7" (UID: "18119c25-297e-4101-855c-1d79282923e7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.438384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "18119c25-297e-4101-855c-1d79282923e7" (UID: "18119c25-297e-4101-855c-1d79282923e7"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.463303 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.463362 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kkgr\" (UniqueName: \"kubernetes.io/projected/18119c25-297e-4101-855c-1d79282923e7-kube-api-access-8kkgr\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.463382 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.463400 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.463418 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.463433 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18119c25-297e-4101-855c-1d79282923e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.902899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" event={"ID":"18119c25-297e-4101-855c-1d79282923e7","Type":"ContainerDied","Data":"36e9730fcd4cc838ca2cdf97e6565bdaba0ae24c2672df19b69301232d4e3f13"} Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.902975 4764 scope.go:117] "RemoveContainer" containerID="d395865b3fb206a1da562bac09ed01816d900ab627ea16c5ae5936a7679ba16a" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.903193 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-z25nx" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.907272 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="98c11f42deb2a855802db6e539c07b78ed64042cc307603fe80868f20ffd6d4f" exitCode=0 Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.907312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"98c11f42deb2a855802db6e539c07b78ed64042cc307603fe80868f20ffd6d4f"} Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.907338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"4bf2e42740725b9d54c8d60efb2a207718601c4c6231f1e898fc274c1b294773"} Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.952486 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-z25nx"] Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.952588 4764 scope.go:117] "RemoveContainer" containerID="0c2cafae36d06d5b16092f588a287b75fbaee2067cdecbe243104d580266304d" Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.959183 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-z25nx"] Oct 01 16:23:22 crc kubenswrapper[4764]: I1001 16:23:22.971294 4764 scope.go:117] "RemoveContainer" containerID="996ccf5d7c8e2755552554ff5a74e5db9102336da04bc8666a5ec3ad70d33d62" Oct 01 16:23:23 crc kubenswrapper[4764]: I1001 16:23:23.733219 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18119c25-297e-4101-855c-1d79282923e7" path="/var/lib/kubelet/pods/18119c25-297e-4101-855c-1d79282923e7/volumes" Oct 01 16:23:29 crc kubenswrapper[4764]: I1001 16:23:29.982643 
4764 generic.go:334] "Generic (PLEG): container finished" podID="17487462-b952-4428-a875-61732b895017" containerID="9f0bf694dc65bd06dbb18c2c5aae8ff5428f096ef978c1bc195102be83784a3b" exitCode=0 Oct 01 16:23:29 crc kubenswrapper[4764]: I1001 16:23:29.982701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17487462-b952-4428-a875-61732b895017","Type":"ContainerDied","Data":"9f0bf694dc65bd06dbb18c2c5aae8ff5428f096ef978c1bc195102be83784a3b"} Oct 01 16:23:30 crc kubenswrapper[4764]: I1001 16:23:30.993920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17487462-b952-4428-a875-61732b895017","Type":"ContainerStarted","Data":"9a3eb54c1e3016ce80a86ee233af3a7790cecdbb3e92878422c3cff955e9c24f"} Oct 01 16:23:30 crc kubenswrapper[4764]: I1001 16:23:30.994390 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.032027 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.032000199 podStartE2EDuration="37.032000199s" podCreationTimestamp="2025-10-01 16:22:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:23:31.030089423 +0000 UTC m=+1274.029736278" watchObservedRunningTime="2025-10-01 16:23:31.032000199 +0000 UTC m=+1274.031647074" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.841858 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz"] Oct 01 16:23:31 crc kubenswrapper[4764]: E1001 16:23:31.842534 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18119c25-297e-4101-855c-1d79282923e7" containerName="init" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.842637 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="18119c25-297e-4101-855c-1d79282923e7" containerName="init" Oct 01 16:23:31 crc kubenswrapper[4764]: E1001 16:23:31.842722 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerName="dnsmasq-dns" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.842788 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerName="dnsmasq-dns" Oct 01 16:23:31 crc kubenswrapper[4764]: E1001 16:23:31.842881 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18119c25-297e-4101-855c-1d79282923e7" containerName="dnsmasq-dns" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.843365 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="18119c25-297e-4101-855c-1d79282923e7" containerName="dnsmasq-dns" Oct 01 16:23:31 crc kubenswrapper[4764]: E1001 16:23:31.843476 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerName="init" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.843541 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerName="init" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.843753 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="18119c25-297e-4101-855c-1d79282923e7" containerName="dnsmasq-dns" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.843829 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2a8617-a485-4228-94ff-874d395bc9a8" containerName="dnsmasq-dns" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.844711 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.848922 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.849079 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.852763 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.853832 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.871173 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz"] Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.948210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.948281 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.948324 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:31 crc kubenswrapper[4764]: I1001 16:23:31.948511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxxdx\" (UniqueName: \"kubernetes.io/projected/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-kube-api-access-bxxdx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.003844 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6c99317-e5aa-4c87-a45a-34e4d14846e4" containerID="ac5bfe6e20abbb15dee31b81bf0b274566ec0a72792433ef17e85cd583e19cca" exitCode=0 Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.003944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e6c99317-e5aa-4c87-a45a-34e4d14846e4","Type":"ContainerDied","Data":"ac5bfe6e20abbb15dee31b81bf0b274566ec0a72792433ef17e85cd583e19cca"} Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.050811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.050921 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.051026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxxdx\" (UniqueName: \"kubernetes.io/projected/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-kube-api-access-bxxdx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.051084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.060118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.062368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.065897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.080555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxxdx\" (UniqueName: \"kubernetes.io/projected/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-kube-api-access-bxxdx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.171313 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.736996 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz"] Oct 01 16:23:32 crc kubenswrapper[4764]: W1001 16:23:32.742467 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcb0612b_19c6_49e5_992d_d7fb9869eeeb.slice/crio-bc970fe65569cee2116ea7bc93dc99e8f718ffb11625554c7ec66fcb2806fcf2 WatchSource:0}: Error finding container bc970fe65569cee2116ea7bc93dc99e8f718ffb11625554c7ec66fcb2806fcf2: Status 404 returned error can't find the container with id bc970fe65569cee2116ea7bc93dc99e8f718ffb11625554c7ec66fcb2806fcf2 Oct 01 16:23:32 crc kubenswrapper[4764]: I1001 16:23:32.745177 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:23:33 crc kubenswrapper[4764]: I1001 16:23:33.015571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" event={"ID":"bcb0612b-19c6-49e5-992d-d7fb9869eeeb","Type":"ContainerStarted","Data":"bc970fe65569cee2116ea7bc93dc99e8f718ffb11625554c7ec66fcb2806fcf2"} Oct 01 16:23:33 crc kubenswrapper[4764]: I1001 16:23:33.018670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e6c99317-e5aa-4c87-a45a-34e4d14846e4","Type":"ContainerStarted","Data":"07fb91082de42f7851d9c1eff04e140ee579b29d1d362eb9dd4ca1e20f4081f1"} Oct 01 16:23:33 crc kubenswrapper[4764]: I1001 16:23:33.018908 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:23:33 crc kubenswrapper[4764]: I1001 16:23:33.050454 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=37.050428911 podStartE2EDuration="37.050428911s" podCreationTimestamp="2025-10-01 16:22:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:23:33.042861774 +0000 UTC m=+1276.042508639" watchObservedRunningTime="2025-10-01 16:23:33.050428911 +0000 UTC m=+1276.050075756" Oct 01 16:23:45 crc kubenswrapper[4764]: I1001 16:23:45.162433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" event={"ID":"bcb0612b-19c6-49e5-992d-d7fb9869eeeb","Type":"ContainerStarted","Data":"d73bb60f2e6eda5d30dc3cf612c93648ffc730a42ce6f351193af2502b409bd5"} Oct 01 16:23:45 crc kubenswrapper[4764]: I1001 16:23:45.194678 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" podStartSLOduration=2.736038293 podStartE2EDuration="14.194651285s" podCreationTimestamp="2025-10-01 16:23:31 +0000 UTC" firstStartedPulling="2025-10-01 16:23:32.744856456 +0000 UTC m=+1275.744503291" lastFinishedPulling="2025-10-01 16:23:44.203469418 +0000 UTC m=+1287.203116283" observedRunningTime="2025-10-01 16:23:45.179752059 +0000 UTC m=+1288.179398924" watchObservedRunningTime="2025-10-01 16:23:45.194651285 +0000 UTC m=+1288.194298160" Oct 01 16:23:45 crc kubenswrapper[4764]: I1001 16:23:45.238724 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 16:23:46 crc kubenswrapper[4764]: I1001 16:23:46.948378 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 16:23:55 crc kubenswrapper[4764]: I1001 16:23:55.286633 4764 generic.go:334] "Generic (PLEG): container finished" podID="bcb0612b-19c6-49e5-992d-d7fb9869eeeb" containerID="d73bb60f2e6eda5d30dc3cf612c93648ffc730a42ce6f351193af2502b409bd5" exitCode=0 Oct 01 
16:23:55 crc kubenswrapper[4764]: I1001 16:23:55.286770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" event={"ID":"bcb0612b-19c6-49e5-992d-d7fb9869eeeb","Type":"ContainerDied","Data":"d73bb60f2e6eda5d30dc3cf612c93648ffc730a42ce6f351193af2502b409bd5"} Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.742503 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.866972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-ssh-key\") pod \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.867207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-inventory\") pod \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.867276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-repo-setup-combined-ca-bundle\") pod \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.867322 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxxdx\" (UniqueName: \"kubernetes.io/projected/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-kube-api-access-bxxdx\") pod \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\" (UID: \"bcb0612b-19c6-49e5-992d-d7fb9869eeeb\") " Oct 01 16:23:56 crc 
kubenswrapper[4764]: I1001 16:23:56.875123 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-kube-api-access-bxxdx" (OuterVolumeSpecName: "kube-api-access-bxxdx") pod "bcb0612b-19c6-49e5-992d-d7fb9869eeeb" (UID: "bcb0612b-19c6-49e5-992d-d7fb9869eeeb"). InnerVolumeSpecName "kube-api-access-bxxdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.880728 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "bcb0612b-19c6-49e5-992d-d7fb9869eeeb" (UID: "bcb0612b-19c6-49e5-992d-d7fb9869eeeb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.909760 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-inventory" (OuterVolumeSpecName: "inventory") pod "bcb0612b-19c6-49e5-992d-d7fb9869eeeb" (UID: "bcb0612b-19c6-49e5-992d-d7fb9869eeeb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.911239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bcb0612b-19c6-49e5-992d-d7fb9869eeeb" (UID: "bcb0612b-19c6-49e5-992d-d7fb9869eeeb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.969643 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.969694 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.969716 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxxdx\" (UniqueName: \"kubernetes.io/projected/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-kube-api-access-bxxdx\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:56 crc kubenswrapper[4764]: I1001 16:23:56.969737 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bcb0612b-19c6-49e5-992d-d7fb9869eeeb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.317085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" event={"ID":"bcb0612b-19c6-49e5-992d-d7fb9869eeeb","Type":"ContainerDied","Data":"bc970fe65569cee2116ea7bc93dc99e8f718ffb11625554c7ec66fcb2806fcf2"} Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.317459 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc970fe65569cee2116ea7bc93dc99e8f718ffb11625554c7ec66fcb2806fcf2" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.317323 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.406083 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"] Oct 01 16:23:57 crc kubenswrapper[4764]: E1001 16:23:57.406518 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb0612b-19c6-49e5-992d-d7fb9869eeeb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.406534 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb0612b-19c6-49e5-992d-d7fb9869eeeb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.406775 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb0612b-19c6-49e5-992d-d7fb9869eeeb" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.407476 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.409614 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.410947 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.411256 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.412665 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.424659 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"] Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.581965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.582041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.582168 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.582229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5p52\" (UniqueName: \"kubernetes.io/projected/66071483-0a25-4d14-afea-3f08fe54ddc5-kube-api-access-m5p52\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.684713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.684780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5p52\" (UniqueName: \"kubernetes.io/projected/66071483-0a25-4d14-afea-3f08fe54ddc5-kube-api-access-m5p52\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.684865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.685001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.689810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.690144 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.695821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 
16:23:57.703894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5p52\" (UniqueName: \"kubernetes.io/projected/66071483-0a25-4d14-afea-3f08fe54ddc5-kube-api-access-m5p52\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:57 crc kubenswrapper[4764]: I1001 16:23:57.730194 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" Oct 01 16:23:58 crc kubenswrapper[4764]: W1001 16:23:58.086725 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66071483_0a25_4d14_afea_3f08fe54ddc5.slice/crio-420245b555ae691141edcdccf0232637811fb0719cc461df1bae80f9948e8e93 WatchSource:0}: Error finding container 420245b555ae691141edcdccf0232637811fb0719cc461df1bae80f9948e8e93: Status 404 returned error can't find the container with id 420245b555ae691141edcdccf0232637811fb0719cc461df1bae80f9948e8e93 Oct 01 16:23:58 crc kubenswrapper[4764]: I1001 16:23:58.091976 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"] Oct 01 16:23:58 crc kubenswrapper[4764]: I1001 16:23:58.329701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" event={"ID":"66071483-0a25-4d14-afea-3f08fe54ddc5","Type":"ContainerStarted","Data":"420245b555ae691141edcdccf0232637811fb0719cc461df1bae80f9948e8e93"} Oct 01 16:23:59 crc kubenswrapper[4764]: I1001 16:23:59.342405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" 
event={"ID":"66071483-0a25-4d14-afea-3f08fe54ddc5","Type":"ContainerStarted","Data":"cb90743ae2f3ca72f211a1043c24b28326f767dde457945e57513182d60721d0"} Oct 01 16:23:59 crc kubenswrapper[4764]: I1001 16:23:59.373481 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" podStartSLOduration=1.931270011 podStartE2EDuration="2.373460689s" podCreationTimestamp="2025-10-01 16:23:57 +0000 UTC" firstStartedPulling="2025-10-01 16:23:58.08975933 +0000 UTC m=+1301.089406185" lastFinishedPulling="2025-10-01 16:23:58.531950038 +0000 UTC m=+1301.531596863" observedRunningTime="2025-10-01 16:23:59.365839182 +0000 UTC m=+1302.365486017" watchObservedRunningTime="2025-10-01 16:23:59.373460689 +0000 UTC m=+1302.373107524" Oct 01 16:24:43 crc kubenswrapper[4764]: I1001 16:24:43.971928 4764 scope.go:117] "RemoveContainer" containerID="4b2caa0febd16d804f1c435f43a3182c494d53301c7d0b8255e18cd8268fc965" Oct 01 16:25:44 crc kubenswrapper[4764]: I1001 16:25:44.049524 4764 scope.go:117] "RemoveContainer" containerID="790fd192ed7fdab21806ebc4f57bd21352d68d0579e641b72cd2e538b04b0473" Oct 01 16:25:44 crc kubenswrapper[4764]: I1001 16:25:44.087743 4764 scope.go:117] "RemoveContainer" containerID="21abde22172561a77c076a728487da7636033b20ec7df2a9d7bc3b4386028293" Oct 01 16:25:44 crc kubenswrapper[4764]: I1001 16:25:44.131115 4764 scope.go:117] "RemoveContainer" containerID="f5690a0515e0af167e6266fcd08780bf17da3f2d432789175acf137324d679e7" Oct 01 16:25:51 crc kubenswrapper[4764]: I1001 16:25:51.914002 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:25:51 crc kubenswrapper[4764]: I1001 16:25:51.914674 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.550258 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.554461 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.560717 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.700744 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-utilities\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.700839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7jqv\" (UniqueName: \"kubernetes.io/projected/a1952fe9-1eeb-48df-a902-99ca6708f92d-kube-api-access-s7jqv\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.700909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-catalog-content\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " 
pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.803123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-catalog-content\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.803346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-utilities\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.803439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7jqv\" (UniqueName: \"kubernetes.io/projected/a1952fe9-1eeb-48df-a902-99ca6708f92d-kube-api-access-s7jqv\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.803729 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-catalog-content\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.804093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-utilities\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc 
kubenswrapper[4764]: I1001 16:26:01.828121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7jqv\" (UniqueName: \"kubernetes.io/projected/a1952fe9-1eeb-48df-a902-99ca6708f92d-kube-api-access-s7jqv\") pod \"redhat-operators-6b8k2\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:01 crc kubenswrapper[4764]: I1001 16:26:01.890871 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:02 crc kubenswrapper[4764]: I1001 16:26:02.371665 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 16:26:02 crc kubenswrapper[4764]: I1001 16:26:02.672019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerStarted","Data":"29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4"} Oct 01 16:26:02 crc kubenswrapper[4764]: I1001 16:26:02.672358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerStarted","Data":"5326af3d6ae5b2bc22be4eedc6a76cb09a59ee7836c7c9d398320afc7f82970e"} Oct 01 16:26:02 crc kubenswrapper[4764]: E1001 16:26:02.773589 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1952fe9_1eeb_48df_a902_99ca6708f92d.slice/crio-conmon-29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4.scope\": RecentStats: unable to find data in memory cache]" Oct 01 16:26:03 crc kubenswrapper[4764]: I1001 16:26:03.683372 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1952fe9-1eeb-48df-a902-99ca6708f92d" 
containerID="29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4" exitCode=0 Oct 01 16:26:03 crc kubenswrapper[4764]: I1001 16:26:03.683618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerDied","Data":"29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4"} Oct 01 16:26:12 crc kubenswrapper[4764]: I1001 16:26:12.780872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerStarted","Data":"02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f"} Oct 01 16:26:13 crc kubenswrapper[4764]: I1001 16:26:13.799413 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerID="02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f" exitCode=0 Oct 01 16:26:13 crc kubenswrapper[4764]: I1001 16:26:13.799588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerDied","Data":"02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f"} Oct 01 16:26:14 crc kubenswrapper[4764]: I1001 16:26:14.815508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerStarted","Data":"36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1"} Oct 01 16:26:14 crc kubenswrapper[4764]: I1001 16:26:14.846393 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6b8k2" podStartSLOduration=2.267779708 podStartE2EDuration="13.846367401s" podCreationTimestamp="2025-10-01 16:26:01 +0000 UTC" firstStartedPulling="2025-10-01 16:26:02.674091936 +0000 UTC 
m=+1425.673738771" lastFinishedPulling="2025-10-01 16:26:14.252679599 +0000 UTC m=+1437.252326464" observedRunningTime="2025-10-01 16:26:14.837089233 +0000 UTC m=+1437.836736108" watchObservedRunningTime="2025-10-01 16:26:14.846367401 +0000 UTC m=+1437.846014266" Oct 01 16:26:21 crc kubenswrapper[4764]: I1001 16:26:21.892202 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:21 crc kubenswrapper[4764]: I1001 16:26:21.894424 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:21 crc kubenswrapper[4764]: I1001 16:26:21.914094 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:26:21 crc kubenswrapper[4764]: I1001 16:26:21.914156 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:26:21 crc kubenswrapper[4764]: I1001 16:26:21.979193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.224801 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vtch5"] Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.226577 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.261310 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vtch5"] Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.323764 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-catalog-content\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.323951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-utilities\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.324145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsqf\" (UniqueName: \"kubernetes.io/projected/85bd5c30-3b37-486a-bc94-e0132899aa8d-kube-api-access-snsqf\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.425511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-utilities\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.425911 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-snsqf\" (UniqueName: \"kubernetes.io/projected/85bd5c30-3b37-486a-bc94-e0132899aa8d-kube-api-access-snsqf\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.425960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-utilities\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.425972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-catalog-content\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.426328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-catalog-content\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.447920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsqf\" (UniqueName: \"kubernetes.io/projected/85bd5c30-3b37-486a-bc94-e0132899aa8d-kube-api-access-snsqf\") pod \"community-operators-vtch5\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.548987 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:22 crc kubenswrapper[4764]: I1001 16:26:22.964750 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 16:26:23 crc kubenswrapper[4764]: I1001 16:26:23.018916 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vtch5"] Oct 01 16:26:23 crc kubenswrapper[4764]: W1001 16:26:23.021464 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bd5c30_3b37_486a_bc94_e0132899aa8d.slice/crio-3b81b8291fa1597e14afcad3f870267e563bf7054bbf6216e8d3a96358837e31 WatchSource:0}: Error finding container 3b81b8291fa1597e14afcad3f870267e563bf7054bbf6216e8d3a96358837e31: Status 404 returned error can't find the container with id 3b81b8291fa1597e14afcad3f870267e563bf7054bbf6216e8d3a96358837e31 Oct 01 16:26:23 crc kubenswrapper[4764]: I1001 16:26:23.934540 4764 generic.go:334] "Generic (PLEG): container finished" podID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerID="f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309" exitCode=0 Oct 01 16:26:23 crc kubenswrapper[4764]: I1001 16:26:23.934650 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerDied","Data":"f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309"} Oct 01 16:26:23 crc kubenswrapper[4764]: I1001 16:26:23.935076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerStarted","Data":"3b81b8291fa1597e14afcad3f870267e563bf7054bbf6216e8d3a96358837e31"} Oct 01 16:26:24 crc kubenswrapper[4764]: I1001 16:26:24.862021 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 16:26:25 crc kubenswrapper[4764]: I1001 16:26:25.220633 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45tgq"] Oct 01 16:26:25 crc kubenswrapper[4764]: I1001 16:26:25.220890 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-45tgq" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="registry-server" containerID="cri-o://9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8" gracePeriod=2 Oct 01 16:26:25 crc kubenswrapper[4764]: E1001 16:26:25.948812 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8 is running failed: container process not found" containerID="9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8" cmd=["grpc_health_probe","-addr=:50051"] Oct 01 16:26:25 crc kubenswrapper[4764]: E1001 16:26:25.950068 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8 is running failed: container process not found" containerID="9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8" cmd=["grpc_health_probe","-addr=:50051"] Oct 01 16:26:25 crc kubenswrapper[4764]: E1001 16:26:25.950510 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8 is running failed: container process not found" containerID="9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8" cmd=["grpc_health_probe","-addr=:50051"] Oct 01 16:26:25 crc kubenswrapper[4764]: E1001 
16:26:25.950607 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-45tgq" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="registry-server" Oct 01 16:26:25 crc kubenswrapper[4764]: I1001 16:26:25.961769 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerStarted","Data":"98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733"} Oct 01 16:26:25 crc kubenswrapper[4764]: I1001 16:26:25.967972 4764 generic.go:334] "Generic (PLEG): container finished" podID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerID="9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8" exitCode=0 Oct 01 16:26:25 crc kubenswrapper[4764]: I1001 16:26:25.968014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerDied","Data":"9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8"} Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.562328 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.602193 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z2v2\" (UniqueName: \"kubernetes.io/projected/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-kube-api-access-4z2v2\") pod \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.602339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-catalog-content\") pod \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.602473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-utilities\") pod \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\" (UID: \"e13b9e77-673e-4b69-93e4-0bc9ab7fe544\") " Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.603687 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-utilities" (OuterVolumeSpecName: "utilities") pod "e13b9e77-673e-4b69-93e4-0bc9ab7fe544" (UID: "e13b9e77-673e-4b69-93e4-0bc9ab7fe544"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.609274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-kube-api-access-4z2v2" (OuterVolumeSpecName: "kube-api-access-4z2v2") pod "e13b9e77-673e-4b69-93e4-0bc9ab7fe544" (UID: "e13b9e77-673e-4b69-93e4-0bc9ab7fe544"). InnerVolumeSpecName "kube-api-access-4z2v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.680304 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e13b9e77-673e-4b69-93e4-0bc9ab7fe544" (UID: "e13b9e77-673e-4b69-93e4-0bc9ab7fe544"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.705176 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z2v2\" (UniqueName: \"kubernetes.io/projected/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-kube-api-access-4z2v2\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.705214 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.705226 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e13b9e77-673e-4b69-93e4-0bc9ab7fe544-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.978341 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45tgq" event={"ID":"e13b9e77-673e-4b69-93e4-0bc9ab7fe544","Type":"ContainerDied","Data":"ddc7c1d1148dae70a88806937413181257473f7645fb6b8899d0d7fba32bc481"} Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.978386 4764 scope.go:117] "RemoveContainer" containerID="9cb143463121a4034e654ba91e76e2d7ae1c04b89073105055870c8783aa10a8" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.978398 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45tgq" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.981116 4764 generic.go:334] "Generic (PLEG): container finished" podID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerID="98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733" exitCode=0 Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:26.981161 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerDied","Data":"98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733"} Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.035510 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45tgq"] Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.044726 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-45tgq"] Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.056954 4764 scope.go:117] "RemoveContainer" containerID="9b69d7ec6eeeebecaa7414817e6d9f3df4d18d3ab60f0fb791e7d78300e5025d" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.182183 4764 scope.go:117] "RemoveContainer" containerID="2273e6f657846bd9fb9e82687503a091491e1a4868a78349cf310450e31ca8dd" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.490870 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql"] Oct 01 16:26:28 crc kubenswrapper[4764]: E1001 16:26:27.491831 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="registry-server" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.491853 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="registry-server" Oct 01 16:26:28 crc kubenswrapper[4764]: 
E1001 16:26:27.491889 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="extract-content" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.491900 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="extract-content" Oct 01 16:26:28 crc kubenswrapper[4764]: E1001 16:26:27.491944 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="extract-utilities" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.491955 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="extract-utilities" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.492205 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" containerName="registry-server" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.493867 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.498630 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.503636 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql"] Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.620740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.620801 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.620834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp7cw\" (UniqueName: \"kubernetes.io/projected/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-kube-api-access-zp7cw\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: 
I1001 16:26:27.683621 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22"] Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.685892 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.704645 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22"] Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.722972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.723027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.723072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp7cw\" (UniqueName: \"kubernetes.io/projected/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-kube-api-access-zp7cw\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" 
Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.723976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.723990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.733756 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13b9e77-673e-4b69-93e4-0bc9ab7fe544" path="/var/lib/kubelet/pods/e13b9e77-673e-4b69-93e4-0bc9ab7fe544/volumes" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.741371 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp7cw\" (UniqueName: \"kubernetes.io/projected/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-kube-api-access-zp7cw\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.822858 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.824340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.824385 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spsq\" (UniqueName: \"kubernetes.io/projected/4d76e759-050c-4e98-b79c-6eb25431c21e-kube-api-access-4spsq\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.824944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.927297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.927416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.927448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4spsq\" (UniqueName: \"kubernetes.io/projected/4d76e759-050c-4e98-b79c-6eb25431c21e-kube-api-access-4spsq\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.927789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.928078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:27.945843 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spsq\" (UniqueName: \"kubernetes.io/projected/4d76e759-050c-4e98-b79c-6eb25431c21e-kube-api-access-4spsq\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:28 crc kubenswrapper[4764]: I1001 16:26:28.003040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:29 crc kubenswrapper[4764]: I1001 16:26:29.007378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerStarted","Data":"86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f"} Oct 01 16:26:29 crc kubenswrapper[4764]: I1001 16:26:29.036932 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vtch5" podStartSLOduration=2.631673159 podStartE2EDuration="7.036905519s" podCreationTimestamp="2025-10-01 16:26:22 +0000 UTC" firstStartedPulling="2025-10-01 16:26:23.936959672 +0000 UTC m=+1446.936606537" lastFinishedPulling="2025-10-01 16:26:28.342192062 +0000 UTC m=+1451.341838897" observedRunningTime="2025-10-01 16:26:29.025214192 +0000 UTC m=+1452.024861067" watchObservedRunningTime="2025-10-01 16:26:29.036905519 +0000 UTC m=+1452.036552364" Oct 01 16:26:29 crc kubenswrapper[4764]: I1001 16:26:29.218158 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22"] Oct 01 16:26:29 crc kubenswrapper[4764]: I1001 16:26:29.227761 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql"] Oct 01 16:26:29 crc kubenswrapper[4764]: W1001 16:26:29.229906 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3673e54_d98b_45f9_a98f_8ccb4e65ccf9.slice/crio-ece4ee6132ea7718de710accb260f283f0aff0de06de8b539e29eedfbd44e06c WatchSource:0}: Error finding container ece4ee6132ea7718de710accb260f283f0aff0de06de8b539e29eedfbd44e06c: Status 404 returned error can't find the container with id ece4ee6132ea7718de710accb260f283f0aff0de06de8b539e29eedfbd44e06c Oct 01 16:26:30 crc kubenswrapper[4764]: I1001 16:26:30.022255 4764 generic.go:334] "Generic (PLEG): container finished" podID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerID="520beaa13f56ef760e7fd32ff4d5ca2e9ae634bc078633a1aaaaf7b776f4a177" exitCode=0 Oct 01 16:26:30 crc kubenswrapper[4764]: I1001 16:26:30.022777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" event={"ID":"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9","Type":"ContainerDied","Data":"520beaa13f56ef760e7fd32ff4d5ca2e9ae634bc078633a1aaaaf7b776f4a177"} Oct 01 16:26:30 crc kubenswrapper[4764]: I1001 16:26:30.022834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" event={"ID":"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9","Type":"ContainerStarted","Data":"ece4ee6132ea7718de710accb260f283f0aff0de06de8b539e29eedfbd44e06c"} Oct 01 16:26:30 crc kubenswrapper[4764]: I1001 16:26:30.026959 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerID="c6b19a37402d9a9ae75a5c9cf24d0ecb8d8b36120a507f86a9fdf0c68a5e06a9" exitCode=0 Oct 01 16:26:30 crc kubenswrapper[4764]: I1001 16:26:30.027030 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" event={"ID":"4d76e759-050c-4e98-b79c-6eb25431c21e","Type":"ContainerDied","Data":"c6b19a37402d9a9ae75a5c9cf24d0ecb8d8b36120a507f86a9fdf0c68a5e06a9"} Oct 01 16:26:30 crc kubenswrapper[4764]: I1001 16:26:30.027087 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" event={"ID":"4d76e759-050c-4e98-b79c-6eb25431c21e","Type":"ContainerStarted","Data":"f2d2ebc72e91a5d63a4ffb1a18d3765c04bc8d3a646a552b3995a512eccdfdb2"} Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.052987 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerID="ba354ec068a32fe57fcb12de8671a388ba8803421797e8389e662bde80d9b0bd" exitCode=0 Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.053520 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" event={"ID":"4d76e759-050c-4e98-b79c-6eb25431c21e","Type":"ContainerDied","Data":"ba354ec068a32fe57fcb12de8671a388ba8803421797e8389e662bde80d9b0bd"} Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.057204 4764 generic.go:334] "Generic (PLEG): container finished" podID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerID="6e0ce138b30d26bb88291eb1feb6a83c8d0329db06fdfe7f79fd945fd7842943" exitCode=0 Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.057267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" event={"ID":"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9","Type":"ContainerDied","Data":"6e0ce138b30d26bb88291eb1feb6a83c8d0329db06fdfe7f79fd945fd7842943"} Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.550368 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.550717 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:32 crc kubenswrapper[4764]: I1001 16:26:32.594522 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:33 crc kubenswrapper[4764]: I1001 16:26:33.079655 4764 generic.go:334] "Generic (PLEG): container finished" podID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerID="9c4794d8602db94e50eac81c1006749393f49e47ba181ac42537905da73846fd" exitCode=0 Oct 01 16:26:33 crc kubenswrapper[4764]: I1001 16:26:33.079737 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" event={"ID":"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9","Type":"ContainerDied","Data":"9c4794d8602db94e50eac81c1006749393f49e47ba181ac42537905da73846fd"} Oct 01 16:26:33 crc kubenswrapper[4764]: I1001 16:26:33.083903 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerID="234adbdff51c82df19f546a2ee35d84236c0972a0a46ae595b18faa9eaca5c7d" exitCode=0 Oct 01 16:26:33 crc kubenswrapper[4764]: I1001 16:26:33.084906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" event={"ID":"4d76e759-050c-4e98-b79c-6eb25431c21e","Type":"ContainerDied","Data":"234adbdff51c82df19f546a2ee35d84236c0972a0a46ae595b18faa9eaca5c7d"} Oct 01 16:26:33 crc kubenswrapper[4764]: I1001 16:26:33.154976 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.460996 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.468808 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.562099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-bundle\") pod \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.562261 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-util\") pod \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.562347 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4spsq\" (UniqueName: \"kubernetes.io/projected/4d76e759-050c-4e98-b79c-6eb25431c21e-kube-api-access-4spsq\") pod \"4d76e759-050c-4e98-b79c-6eb25431c21e\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.562475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp7cw\" (UniqueName: \"kubernetes.io/projected/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-kube-api-access-zp7cw\") pod \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\" (UID: \"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9\") " Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.562533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-util\") pod \"4d76e759-050c-4e98-b79c-6eb25431c21e\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.562566 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-bundle\") pod \"4d76e759-050c-4e98-b79c-6eb25431c21e\" (UID: \"4d76e759-050c-4e98-b79c-6eb25431c21e\") " Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.563930 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-bundle" (OuterVolumeSpecName: "bundle") pod "b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" (UID: "b3673e54-d98b-45f9-a98f-8ccb4e65ccf9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.564869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-bundle" (OuterVolumeSpecName: "bundle") pod "4d76e759-050c-4e98-b79c-6eb25431c21e" (UID: "4d76e759-050c-4e98-b79c-6eb25431c21e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.568262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-kube-api-access-zp7cw" (OuterVolumeSpecName: "kube-api-access-zp7cw") pod "b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" (UID: "b3673e54-d98b-45f9-a98f-8ccb4e65ccf9"). InnerVolumeSpecName "kube-api-access-zp7cw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.571875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d76e759-050c-4e98-b79c-6eb25431c21e-kube-api-access-4spsq" (OuterVolumeSpecName: "kube-api-access-4spsq") pod "4d76e759-050c-4e98-b79c-6eb25431c21e" (UID: "4d76e759-050c-4e98-b79c-6eb25431c21e"). InnerVolumeSpecName "kube-api-access-4spsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.665094 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4spsq\" (UniqueName: \"kubernetes.io/projected/4d76e759-050c-4e98-b79c-6eb25431c21e-kube-api-access-4spsq\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.665129 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp7cw\" (UniqueName: \"kubernetes.io/projected/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-kube-api-access-zp7cw\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.665139 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:34 crc kubenswrapper[4764]: I1001 16:26:34.665149 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.114320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" event={"ID":"b3673e54-d98b-45f9-a98f-8ccb4e65ccf9","Type":"ContainerDied","Data":"ece4ee6132ea7718de710accb260f283f0aff0de06de8b539e29eedfbd44e06c"} Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 
16:26:35.114388 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.114398 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ece4ee6132ea7718de710accb260f283f0aff0de06de8b539e29eedfbd44e06c" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.118491 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.118554 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22" event={"ID":"4d76e759-050c-4e98-b79c-6eb25431c21e","Type":"ContainerDied","Data":"f2d2ebc72e91a5d63a4ffb1a18d3765c04bc8d3a646a552b3995a512eccdfdb2"} Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.118585 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d2ebc72e91a5d63a4ffb1a18d3765c04bc8d3a646a552b3995a512eccdfdb2" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.176210 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-util" (OuterVolumeSpecName: "util") pod "4d76e759-050c-4e98-b79c-6eb25431c21e" (UID: "4d76e759-050c-4e98-b79c-6eb25431c21e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.237851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-util" (OuterVolumeSpecName: "util") pod "b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" (UID: "b3673e54-d98b-45f9-a98f-8ccb4e65ccf9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.277778 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3673e54-d98b-45f9-a98f-8ccb4e65ccf9-util\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.277833 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4d76e759-050c-4e98-b79c-6eb25431c21e-util\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:35 crc kubenswrapper[4764]: I1001 16:26:35.427171 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vtch5"] Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.130540 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vtch5" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="registry-server" containerID="cri-o://86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f" gracePeriod=2 Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.570742 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.707410 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-catalog-content\") pod \"85bd5c30-3b37-486a-bc94-e0132899aa8d\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.707608 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-utilities\") pod \"85bd5c30-3b37-486a-bc94-e0132899aa8d\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.707886 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snsqf\" (UniqueName: \"kubernetes.io/projected/85bd5c30-3b37-486a-bc94-e0132899aa8d-kube-api-access-snsqf\") pod \"85bd5c30-3b37-486a-bc94-e0132899aa8d\" (UID: \"85bd5c30-3b37-486a-bc94-e0132899aa8d\") " Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.708439 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-utilities" (OuterVolumeSpecName: "utilities") pod "85bd5c30-3b37-486a-bc94-e0132899aa8d" (UID: "85bd5c30-3b37-486a-bc94-e0132899aa8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.711443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bd5c30-3b37-486a-bc94-e0132899aa8d-kube-api-access-snsqf" (OuterVolumeSpecName: "kube-api-access-snsqf") pod "85bd5c30-3b37-486a-bc94-e0132899aa8d" (UID: "85bd5c30-3b37-486a-bc94-e0132899aa8d"). InnerVolumeSpecName "kube-api-access-snsqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.759671 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85bd5c30-3b37-486a-bc94-e0132899aa8d" (UID: "85bd5c30-3b37-486a-bc94-e0132899aa8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.811039 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.811098 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snsqf\" (UniqueName: \"kubernetes.io/projected/85bd5c30-3b37-486a-bc94-e0132899aa8d-kube-api-access-snsqf\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:36 crc kubenswrapper[4764]: I1001 16:26:36.811114 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85bd5c30-3b37-486a-bc94-e0132899aa8d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.146578 4764 generic.go:334] "Generic (PLEG): container finished" podID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerID="86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f" exitCode=0 Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.146681 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vtch5" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.146678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerDied","Data":"86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f"} Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.148178 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vtch5" event={"ID":"85bd5c30-3b37-486a-bc94-e0132899aa8d","Type":"ContainerDied","Data":"3b81b8291fa1597e14afcad3f870267e563bf7054bbf6216e8d3a96358837e31"} Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.148212 4764 scope.go:117] "RemoveContainer" containerID="86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.184367 4764 scope.go:117] "RemoveContainer" containerID="98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.214916 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vtch5"] Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.224914 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vtch5"] Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.238908 4764 scope.go:117] "RemoveContainer" containerID="f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.275126 4764 scope.go:117] "RemoveContainer" containerID="86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f" Oct 01 16:26:37 crc kubenswrapper[4764]: E1001 16:26:37.275684 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f\": container with ID starting with 86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f not found: ID does not exist" containerID="86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.275844 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f"} err="failed to get container status \"86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f\": rpc error: code = NotFound desc = could not find container \"86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f\": container with ID starting with 86ae4b3d429093aba5e2b524484fda3c32b07cfd810b522fe0c845cf4740cf7f not found: ID does not exist" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.276003 4764 scope.go:117] "RemoveContainer" containerID="98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733" Oct 01 16:26:37 crc kubenswrapper[4764]: E1001 16:26:37.276543 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733\": container with ID starting with 98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733 not found: ID does not exist" containerID="98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.276581 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733"} err="failed to get container status \"98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733\": rpc error: code = NotFound desc = could not find container \"98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733\": container with ID 
starting with 98eab8c49b1759837c2d73fc57c4204fd63be16a99d5dad15a0bf5a633e87733 not found: ID does not exist" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.276604 4764 scope.go:117] "RemoveContainer" containerID="f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309" Oct 01 16:26:37 crc kubenswrapper[4764]: E1001 16:26:37.276893 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309\": container with ID starting with f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309 not found: ID does not exist" containerID="f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.277005 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309"} err="failed to get container status \"f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309\": rpc error: code = NotFound desc = could not find container \"f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309\": container with ID starting with f43ef0f452496867cbc0d9daf70adee2678fb1d6a338a4014432558a08845309 not found: ID does not exist" Oct 01 16:26:37 crc kubenswrapper[4764]: I1001 16:26:37.735106 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" path="/var/lib/kubelet/pods/85bd5c30-3b37-486a-bc94-e0132899aa8d/volumes" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.259723 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4"] Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260186 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="extract-utilities" Oct 01 16:26:38 crc 
kubenswrapper[4764]: I1001 16:26:38.260204 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="extract-utilities" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260217 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="util" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260224 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="util" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260241 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="registry-server" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260249 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="registry-server" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260268 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="util" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260275 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="util" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260293 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="pull" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260299 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="pull" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260308 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="extract" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260316 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="extract" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260333 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="pull" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260340 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="pull" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260353 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="extract-content" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260359 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="extract-content" Oct 01 16:26:38 crc kubenswrapper[4764]: E1001 16:26:38.260369 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="extract" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260376 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="extract" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260600 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3673e54-d98b-45f9-a98f-8ccb4e65ccf9" containerName="extract" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260619 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bd5c30-3b37-486a-bc94-e0132899aa8d" containerName="registry-server" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.260631 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d76e759-050c-4e98-b79c-6eb25431c21e" containerName="extract" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.261413 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.267524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4"] Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.347729 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj5t4\" (UniqueName: \"kubernetes.io/projected/dd999924-e56c-45fa-8214-cec275174611-kube-api-access-bj5t4\") pod \"nmstate-operator-858ddd8f98-lfzc4\" (UID: \"dd999924-e56c-45fa-8214-cec275174611\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.449713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj5t4\" (UniqueName: \"kubernetes.io/projected/dd999924-e56c-45fa-8214-cec275174611-kube-api-access-bj5t4\") pod \"nmstate-operator-858ddd8f98-lfzc4\" (UID: \"dd999924-e56c-45fa-8214-cec275174611\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.476258 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj5t4\" (UniqueName: \"kubernetes.io/projected/dd999924-e56c-45fa-8214-cec275174611-kube-api-access-bj5t4\") pod \"nmstate-operator-858ddd8f98-lfzc4\" (UID: \"dd999924-e56c-45fa-8214-cec275174611\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" Oct 01 16:26:38 crc kubenswrapper[4764]: I1001 16:26:38.582638 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" Oct 01 16:26:39 crc kubenswrapper[4764]: I1001 16:26:39.049479 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4"] Oct 01 16:26:39 crc kubenswrapper[4764]: I1001 16:26:39.167145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" event={"ID":"dd999924-e56c-45fa-8214-cec275174611","Type":"ContainerStarted","Data":"5dc35f6ad27cef9311dfabf2c43f6d88a30285caf7b5d9a945ffc821a4899874"} Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.242069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" event={"ID":"dd999924-e56c-45fa-8214-cec275174611","Type":"ContainerStarted","Data":"fc270a557dac863b56e31eefd8044197e9c799c40bbb04adba5473cfafa14c73"} Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.264835 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-lfzc4" podStartSLOduration=1.727751827 podStartE2EDuration="4.264814649s" podCreationTimestamp="2025-10-01 16:26:38 +0000 UTC" firstStartedPulling="2025-10-01 16:26:39.053521113 +0000 UTC m=+1462.053167948" lastFinishedPulling="2025-10-01 16:26:41.590583935 +0000 UTC m=+1464.590230770" observedRunningTime="2025-10-01 16:26:42.263344903 +0000 UTC m=+1465.262991738" watchObservedRunningTime="2025-10-01 16:26:42.264814649 +0000 UTC m=+1465.264461484" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.289828 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d"] Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.290044 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" podUID="ca151bec-53be-465f-a65d-7cd62254e50f" 
containerName="nmstate-operator" containerID="cri-o://9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51" gracePeriod=30 Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.333308 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj"] Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.333721 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="nmstate-metrics" containerID="cri-o://826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3" gracePeriod=30 Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.333875 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="kube-rbac-proxy" containerID="cri-o://512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178" gracePeriod=30 Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.355520 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj"] Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.355984 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" podUID="2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" containerName="nmstate-webhook" containerID="cri-o://8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d" gracePeriod=30 Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.374211 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-handler-vkk2s"] Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.375222 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-nmstate/nmstate-handler-vkk2s" 
podUID="8e5be6aa-6ed0-460a-81a8-1681016c3542" containerName="nmstate-handler" containerID="cri-o://aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244" gracePeriod=30 Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.493171 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns"] Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.494743 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.503037 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns"] Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.626698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b55cc57-324e-4f77-aa9f-d655abd399b4-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.626937 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxrkj\" (UniqueName: \"kubernetes.io/projected/5b55cc57-324e-4f77-aa9f-d655abd399b4-kube-api-access-cxrkj\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.627091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b55cc57-324e-4f77-aa9f-d655abd399b4-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: 
\"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.731481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxrkj\" (UniqueName: \"kubernetes.io/projected/5b55cc57-324e-4f77-aa9f-d655abd399b4-kube-api-access-cxrkj\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.731575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b55cc57-324e-4f77-aa9f-d655abd399b4-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.731667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b55cc57-324e-4f77-aa9f-d655abd399b4-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.735999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5b55cc57-324e-4f77-aa9f-d655abd399b4-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.737442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5b55cc57-324e-4f77-aa9f-d655abd399b4-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:42 crc kubenswrapper[4764]: I1001 16:26:42.763568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxrkj\" (UniqueName: \"kubernetes.io/projected/5b55cc57-324e-4f77-aa9f-d655abd399b4-kube-api-access-cxrkj\") pod \"nmstate-console-plugin-6b874cbd85-t66ns\" (UID: \"5b55cc57-324e-4f77-aa9f-d655abd399b4\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.029198 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.069603 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.082406 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.082406 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.135661 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-snhkz"] Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.136055 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" containerName="nmstate-webhook" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136071 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" containerName="nmstate-webhook" Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.136087 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5be6aa-6ed0-460a-81a8-1681016c3542" containerName="nmstate-handler" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136095 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5be6aa-6ed0-460a-81a8-1681016c3542" containerName="nmstate-handler" Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.136110 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="nmstate-metrics" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136116 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="nmstate-metrics" Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.136143 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="kube-rbac-proxy" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136148 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="kube-rbac-proxy" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136313 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" containerName="nmstate-webhook" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136334 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="kube-rbac-proxy" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136347 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="77091922-9776-42fc-af29-381f72eb28c6" containerName="nmstate-metrics" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136358 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5be6aa-6ed0-460a-81a8-1681016c3542" containerName="nmstate-handler" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.136936 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2d8g\" (UniqueName: \"kubernetes.io/projected/8e5be6aa-6ed0-460a-81a8-1681016c3542-kube-api-access-z2d8g\") pod \"8e5be6aa-6ed0-460a-81a8-1681016c3542\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137302 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-ovs-socket\") pod \"8e5be6aa-6ed0-460a-81a8-1681016c3542\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137342 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair\") pod \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137379 
4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-ovs-socket" (OuterVolumeSpecName: "ovs-socket") pod "8e5be6aa-6ed0-460a-81a8-1681016c3542" (UID: "8e5be6aa-6ed0-460a-81a8-1681016c3542"). InnerVolumeSpecName "ovs-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-nmstate-lock\") pod \"8e5be6aa-6ed0-460a-81a8-1681016c3542\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qln6x\" (UniqueName: \"kubernetes.io/projected/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-kube-api-access-qln6x\") pod \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\" (UID: \"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dnjf\" (UniqueName: \"kubernetes.io/projected/77091922-9776-42fc-af29-381f72eb28c6-kube-api-access-2dnjf\") pod \"77091922-9776-42fc-af29-381f72eb28c6\" (UID: \"77091922-9776-42fc-af29-381f72eb28c6\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.137640 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-dbus-socket\") pod \"8e5be6aa-6ed0-460a-81a8-1681016c3542\" (UID: \"8e5be6aa-6ed0-460a-81a8-1681016c3542\") " Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.138851 4764 reconciler_common.go:293] "Volume detached for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-ovs-socket\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.139466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-dbus-socket" (OuterVolumeSpecName: "dbus-socket") pod "8e5be6aa-6ed0-460a-81a8-1681016c3542" (UID: "8e5be6aa-6ed0-460a-81a8-1681016c3542"). InnerVolumeSpecName "dbus-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.139497 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-nmstate-lock" (OuterVolumeSpecName: "nmstate-lock") pod "8e5be6aa-6ed0-460a-81a8-1681016c3542" (UID: "8e5be6aa-6ed0-460a-81a8-1681016c3542"). InnerVolumeSpecName "nmstate-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.141678 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-kube-api-access-qln6x" (OuterVolumeSpecName: "kube-api-access-qln6x") pod "2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" (UID: "2ef4a5fe-56b8-4b85-b1e6-831f98dd880b"). InnerVolumeSpecName "kube-api-access-qln6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.142672 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5be6aa-6ed0-460a-81a8-1681016c3542-kube-api-access-z2d8g" (OuterVolumeSpecName: "kube-api-access-z2d8g") pod "8e5be6aa-6ed0-460a-81a8-1681016c3542" (UID: "8e5be6aa-6ed0-460a-81a8-1681016c3542"). InnerVolumeSpecName "kube-api-access-z2d8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.144738 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77091922-9776-42fc-af29-381f72eb28c6-kube-api-access-2dnjf" (OuterVolumeSpecName: "kube-api-access-2dnjf") pod "77091922-9776-42fc-af29-381f72eb28c6" (UID: "77091922-9776-42fc-af29-381f72eb28c6"). InnerVolumeSpecName "kube-api-access-2dnjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.149205 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair" (OuterVolumeSpecName: "tls-key-pair") pod "2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" (UID: "2ef4a5fe-56b8-4b85-b1e6-831f98dd880b"). InnerVolumeSpecName "tls-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.234972 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.236580 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-dbus-socket\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240729 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-ovs-socket\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-nmstate-lock\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr9jh\" (UniqueName: \"kubernetes.io/projected/b82401c8-962a-451a-954e-2f603fe91129-kube-api-access-gr9jh\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240914 4764 reconciler_common.go:293] "Volume detached for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-dbus-socket\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 
16:26:43.240937 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2d8g\" (UniqueName: \"kubernetes.io/projected/8e5be6aa-6ed0-460a-81a8-1681016c3542-kube-api-access-z2d8g\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240951 4764 reconciler_common.go:293] "Volume detached for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-tls-key-pair\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240963 4764 reconciler_common.go:293] "Volume detached for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8e5be6aa-6ed0-460a-81a8-1681016c3542-nmstate-lock\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240975 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qln6x\" (UniqueName: \"kubernetes.io/projected/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b-kube-api-access-qln6x\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.240987 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dnjf\" (UniqueName: \"kubernetes.io/projected/77091922-9776-42fc-af29-381f72eb28c6-kube-api-access-2dnjf\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.250591 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.257921 4764 generic.go:334] "Generic (PLEG): container finished" podID="8e5be6aa-6ed0-460a-81a8-1681016c3542" containerID="aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244" exitCode=0 Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.258005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vkk2s" 
event={"ID":"8e5be6aa-6ed0-460a-81a8-1681016c3542","Type":"ContainerDied","Data":"aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.258033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vkk2s" event={"ID":"8e5be6aa-6ed0-460a-81a8-1681016c3542","Type":"ContainerDied","Data":"7f8a146c2ea6127875161e5ff6e9621fba9102d2fa95b827db62ba1a1ec5e2ef"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.258052 4764 scope.go:117] "RemoveContainer" containerID="aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.258165 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vkk2s" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.262983 4764 generic.go:334] "Generic (PLEG): container finished" podID="2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" containerID="8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d" exitCode=0 Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.263040 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" event={"ID":"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b","Type":"ContainerDied","Data":"8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.263134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" event={"ID":"2ef4a5fe-56b8-4b85-b1e6-831f98dd880b","Type":"ContainerDied","Data":"fd5c87ffe45d8d9fc9a2a8cbfcc7ef9fbe947ad81ec5fbad759ddc6e2356d8b5"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.263178 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.265194 4764 generic.go:334] "Generic (PLEG): container finished" podID="ca151bec-53be-465f-a65d-7cd62254e50f" containerID="9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51" exitCode=0 Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.265263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" event={"ID":"ca151bec-53be-465f-a65d-7cd62254e50f","Type":"ContainerDied","Data":"9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.272868 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.272637 4764 generic.go:334] "Generic (PLEG): container finished" podID="77091922-9776-42fc-af29-381f72eb28c6" containerID="512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178" exitCode=0 Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.273228 4764 generic.go:334] "Generic (PLEG): container finished" podID="77091922-9776-42fc-af29-381f72eb28c6" containerID="826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3" exitCode=0 Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.273312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" event={"ID":"77091922-9776-42fc-af29-381f72eb28c6","Type":"ContainerDied","Data":"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.273367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" 
event={"ID":"77091922-9776-42fc-af29-381f72eb28c6","Type":"ContainerDied","Data":"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.273385 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj" event={"ID":"77091922-9776-42fc-af29-381f72eb28c6","Type":"ContainerDied","Data":"b2bbbf432302ec59fc84b5aa0ff9da03fd9243aebcd12cc851aeaac0a801d081"} Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.329862 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-handler-vkk2s"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-ovs-socket\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nls7w\" (UniqueName: \"kubernetes.io/projected/e8f3e872-b4e9-4b58-95aa-f63812824933-kube-api-access-nls7w\") pod \"nmstate-metrics-fdff9cb8d-dg2fb\" (UID: \"e8f3e872-b4e9-4b58-95aa-f63812824933\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-nmstate-lock\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346419 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr9jh\" 
(UniqueName: \"kubernetes.io/projected/b82401c8-962a-451a-954e-2f603fe91129-kube-api-access-gr9jh\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-dbus-socket\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346822 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-dbus-socket\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.346866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-ovs-socket\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.347825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b82401c8-962a-451a-954e-2f603fe91129-nmstate-lock\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.359780 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-nmstate/nmstate-handler-vkk2s"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.370931 4764 scope.go:117] "RemoveContainer" 
containerID="aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244" Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.373195 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244\": container with ID starting with aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244 not found: ID does not exist" containerID="aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.373229 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244"} err="failed to get container status \"aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244\": rpc error: code = NotFound desc = could not find container \"aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244\": container with ID starting with aa9d4ac6d0cf9649f93c4fac36f8b7b6a8c6317b78b6c666d9b25cf0c8c87244 not found: ID does not exist" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.373249 4764 scope.go:117] "RemoveContainer" containerID="8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.374560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr9jh\" (UniqueName: \"kubernetes.io/projected/b82401c8-962a-451a-954e-2f603fe91129-kube-api-access-gr9jh\") pod \"nmstate-handler-snhkz\" (UID: \"b82401c8-962a-451a-954e-2f603fe91129\") " pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.382641 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.383808 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.388832 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.418516 4764 scope.go:117] "RemoveContainer" containerID="8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.418876 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj"] Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.418953 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d\": container with ID starting with 8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d not found: ID does not exist" containerID="8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.419014 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d"} err="failed to get container status \"8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d\": rpc error: code = NotFound desc = could not find container \"8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d\": container with ID starting with 8a51083d03ababbcfc0ad24c5bb6a5b18eed0300b217cc8b02e91ff1c2f5ae3d not found: ID does not exist" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.419039 4764 scope.go:117] "RemoveContainer" containerID="512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.449967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mlpdk\" (UniqueName: \"kubernetes.io/projected/2f2de87a-678a-4de3-8fec-9b695f301201-kube-api-access-mlpdk\") pod \"nmstate-webhook-6cdbc54649-bmnmm\" (UID: \"2f2de87a-678a-4de3-8fec-9b695f301201\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.450301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nls7w\" (UniqueName: \"kubernetes.io/projected/e8f3e872-b4e9-4b58-95aa-f63812824933-kube-api-access-nls7w\") pod \"nmstate-metrics-fdff9cb8d-dg2fb\" (UID: \"e8f3e872-b4e9-4b58-95aa-f63812824933\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.450486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2f2de87a-678a-4de3-8fec-9b695f301201-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bmnmm\" (UID: \"2f2de87a-678a-4de3-8fec-9b695f301201\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.450968 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.458371 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.459692 4764 scope.go:117] "RemoveContainer" containerID="826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.476913 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nls7w\" (UniqueName: \"kubernetes.io/projected/e8f3e872-b4e9-4b58-95aa-f63812824933-kube-api-access-nls7w\") pod \"nmstate-metrics-fdff9cb8d-dg2fb\" (UID: \"e8f3e872-b4e9-4b58-95aa-f63812824933\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.478530 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.483643 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-ljlwj"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.489961 4764 scope.go:117] "RemoveContainer" containerID="512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178" Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.490335 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178\": container with ID starting with 512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178 not found: ID does not exist" containerID="512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.490375 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178"} err="failed to get container status \"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178\": rpc error: code = NotFound desc = could not find container \"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178\": container with ID starting with 512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178 not found: ID does not exist" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.490403 4764 scope.go:117] "RemoveContainer" containerID="826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3" Oct 01 16:26:43 crc kubenswrapper[4764]: E1001 16:26:43.490773 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3\": container with ID starting with 826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3 not found: ID does not exist" containerID="826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.490801 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3"} err="failed to get container status \"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3\": rpc error: code = NotFound desc = could not find container \"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3\": container with ID starting with 826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3 not found: ID does not exist" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.490822 4764 scope.go:117] "RemoveContainer" containerID="512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.490928 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.491083 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178"} err="failed to get container status \"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178\": rpc error: code = NotFound desc = could not find container \"512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178\": container with ID starting with 512dbf704b9431dd49f50a38cd188e975f5309fb5beb8a9b1374b1d435c97178 not found: ID does not exist" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.491264 4764 scope.go:117] "RemoveContainer" containerID="826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.491803 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3"} err="failed to get container status \"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3\": rpc error: code = NotFound desc = could not find container \"826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3\": container with ID starting with 826c41e17cbd6c50efac1e47ad3aaea60b3248954a1cc540a31f3791c77008e3 not found: ID does not exist" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.498627 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-8r7dj"] Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.552016 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xszmw\" (UniqueName: \"kubernetes.io/projected/ca151bec-53be-465f-a65d-7cd62254e50f-kube-api-access-xszmw\") pod \"ca151bec-53be-465f-a65d-7cd62254e50f\" (UID: \"ca151bec-53be-465f-a65d-7cd62254e50f\") " Oct 01 16:26:43 crc 
kubenswrapper[4764]: I1001 16:26:43.552660 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2f2de87a-678a-4de3-8fec-9b695f301201-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bmnmm\" (UID: \"2f2de87a-678a-4de3-8fec-9b695f301201\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.552690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlpdk\" (UniqueName: \"kubernetes.io/projected/2f2de87a-678a-4de3-8fec-9b695f301201-kube-api-access-mlpdk\") pod \"nmstate-webhook-6cdbc54649-bmnmm\" (UID: \"2f2de87a-678a-4de3-8fec-9b695f301201\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.556362 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.558097 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca151bec-53be-465f-a65d-7cd62254e50f-kube-api-access-xszmw" (OuterVolumeSpecName: "kube-api-access-xszmw") pod "ca151bec-53be-465f-a65d-7cd62254e50f" (UID: "ca151bec-53be-465f-a65d-7cd62254e50f"). InnerVolumeSpecName "kube-api-access-xszmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.560241 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2f2de87a-678a-4de3-8fec-9b695f301201-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bmnmm\" (UID: \"2f2de87a-678a-4de3-8fec-9b695f301201\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.574908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlpdk\" (UniqueName: \"kubernetes.io/projected/2f2de87a-678a-4de3-8fec-9b695f301201-kube-api-access-mlpdk\") pod \"nmstate-webhook-6cdbc54649-bmnmm\" (UID: \"2f2de87a-678a-4de3-8fec-9b695f301201\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.654438 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xszmw\" (UniqueName: \"kubernetes.io/projected/ca151bec-53be-465f-a65d-7cd62254e50f-kube-api-access-xszmw\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.711763 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.753906 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef4a5fe-56b8-4b85-b1e6-831f98dd880b" path="/var/lib/kubelet/pods/2ef4a5fe-56b8-4b85-b1e6-831f98dd880b/volumes" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.754740 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77091922-9776-42fc-af29-381f72eb28c6" path="/var/lib/kubelet/pods/77091922-9776-42fc-af29-381f72eb28c6/volumes" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.755297 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5be6aa-6ed0-460a-81a8-1681016c3542" path="/var/lib/kubelet/pods/8e5be6aa-6ed0-460a-81a8-1681016c3542/volumes" Oct 01 16:26:43 crc kubenswrapper[4764]: I1001 16:26:43.763171 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns"] Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.029581 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb"] Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.199306 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm"] Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.251583 4764 scope.go:117] "RemoveContainer" containerID="8a97baa95a639c07f0ba933522b09bece922fe70ae6860602ebbb05b5d8d9b6c" Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.274380 4764 scope.go:117] "RemoveContainer" containerID="4711db450b745531ad6b2458236fdd08c7c1146b120434cdae64a8b81ced7236" Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.286455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" 
event={"ID":"2f2de87a-678a-4de3-8fec-9b695f301201","Type":"ContainerStarted","Data":"bd54b5d8bc613a293980ef45bb9d843e77567c60dbbe81663fbed3ed672fde5d"} Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.288766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" event={"ID":"ca151bec-53be-465f-a65d-7cd62254e50f","Type":"ContainerDied","Data":"56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740"} Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.288794 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d" Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.288824 4764 scope.go:117] "RemoveContainer" containerID="9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51" Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.294913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-snhkz" event={"ID":"b82401c8-962a-451a-954e-2f603fe91129","Type":"ContainerStarted","Data":"c895f268d9e6952efde1f153b8194d2ccba6052385e35c955331f1aefd66f6e4"} Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.299870 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" event={"ID":"e8f3e872-b4e9-4b58-95aa-f63812824933","Type":"ContainerStarted","Data":"0746cf0041689fc6fe4dbb7f7ade44928a48e6d309869199385f57e88084f67b"} Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.301566 4764 scope.go:117] "RemoveContainer" containerID="9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51" Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.302194 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" 
event={"ID":"5b55cc57-324e-4f77-aa9f-d655abd399b4","Type":"ContainerStarted","Data":"6b06fbe5898f339da71e53245625ec509d48989eb29bc561e288dc57988c0389"} Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.318116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d"] Oct 01 16:26:44 crc kubenswrapper[4764]: E1001 16:26:44.322562 4764 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_nmstate-operator_nmstate-operator-5d6f6cfd66-t979d_openshift-nmstate_ca151bec-53be-465f-a65d-7cd62254e50f_0 in pod sandbox 56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740 from index: no such id: '9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51'" containerID="9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51" Oct 01 16:26:44 crc kubenswrapper[4764]: E1001 16:26:44.322612 4764 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_nmstate-operator_nmstate-operator-5d6f6cfd66-t979d_openshift-nmstate_ca151bec-53be-465f-a65d-7cd62254e50f_0 in pod sandbox 56b7a34ca5766a3a0e0c96a94f826b0a5749512a645c694c4c708ee6f429c740 from index: no such id: '9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51'" containerID="9231daaac6e35c9d0d1d9db442a481d038778a91823211ee6b01640e141ccf51" Oct 01 16:26:44 crc kubenswrapper[4764]: I1001 16:26:44.330680 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-t979d"] Oct 01 16:26:45 crc kubenswrapper[4764]: I1001 16:26:45.735861 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca151bec-53be-465f-a65d-7cd62254e50f" path="/var/lib/kubelet/pods/ca151bec-53be-465f-a65d-7cd62254e50f/volumes" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.403630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" event={"ID":"2f2de87a-678a-4de3-8fec-9b695f301201","Type":"ContainerStarted","Data":"c82e536c3251dde761aa3de6d131291f2d05afbb87be1eb05eed3574ac64187b"} Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.404193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.405993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" event={"ID":"5b55cc57-324e-4f77-aa9f-d655abd399b4","Type":"ContainerStarted","Data":"afa613e7e3bebe64db08b2daa5fb3d83976193552cdc0a78eb174471d270e550"} Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.408950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-snhkz" event={"ID":"b82401c8-962a-451a-954e-2f603fe91129","Type":"ContainerStarted","Data":"da57f10217048312750da3d960f4cd262e64a0cc9d4a50c1aee6d5b6a3570d1a"} Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.409567 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-snhkz" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.411431 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" event={"ID":"e8f3e872-b4e9-4b58-95aa-f63812824933","Type":"ContainerStarted","Data":"0eb9b1f431b6a2e87cfbc45b8831d9b5fae71a51f4b2c585dde683055e998497"} Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.428389 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm" podStartSLOduration=2.511264007 podStartE2EDuration="8.428371259s" podCreationTimestamp="2025-10-01 16:26:43 +0000 UTC" firstStartedPulling="2025-10-01 16:26:44.202409068 +0000 UTC m=+1467.202055903" lastFinishedPulling="2025-10-01 16:26:50.11951632 
+0000 UTC m=+1473.119163155" observedRunningTime="2025-10-01 16:26:51.421197903 +0000 UTC m=+1474.420844738" watchObservedRunningTime="2025-10-01 16:26:51.428371259 +0000 UTC m=+1474.428018094" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.469415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-snhkz" podStartSLOduration=1.9700816209999998 podStartE2EDuration="8.469391043s" podCreationTimestamp="2025-10-01 16:26:43 +0000 UTC" firstStartedPulling="2025-10-01 16:26:43.529211249 +0000 UTC m=+1466.528858074" lastFinishedPulling="2025-10-01 16:26:50.028520661 +0000 UTC m=+1473.028167496" observedRunningTime="2025-10-01 16:26:51.444821452 +0000 UTC m=+1474.444468287" watchObservedRunningTime="2025-10-01 16:26:51.469391043 +0000 UTC m=+1474.469037898" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.496883 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-t66ns" podStartSLOduration=3.266572583 podStartE2EDuration="9.496862726s" podCreationTimestamp="2025-10-01 16:26:42 +0000 UTC" firstStartedPulling="2025-10-01 16:26:43.767330241 +0000 UTC m=+1466.766977076" lastFinishedPulling="2025-10-01 16:26:49.997620384 +0000 UTC m=+1472.997267219" observedRunningTime="2025-10-01 16:26:51.467537258 +0000 UTC m=+1474.467184103" watchObservedRunningTime="2025-10-01 16:26:51.496862726 +0000 UTC m=+1474.496509561" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.515767 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n"] Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.515967 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" podUID="d4957cda-9179-4505-ac94-adaf24c337e8" containerName="nmstate-console-plugin" 
containerID="cri-o://ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d" gracePeriod=30 Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.914530 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.914874 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.914930 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.915728 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bf2e42740725b9d54c8d60efb2a207718601c4c6231f1e898fc274c1b294773"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:26:51 crc kubenswrapper[4764]: I1001 16:26:51.915791 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://4bf2e42740725b9d54c8d60efb2a207718601c4c6231f1e898fc274c1b294773" gracePeriod=600 Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.106412 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.254695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptp5c\" (UniqueName: \"kubernetes.io/projected/d4957cda-9179-4505-ac94-adaf24c337e8-kube-api-access-ptp5c\") pod \"d4957cda-9179-4505-ac94-adaf24c337e8\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.254814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4957cda-9179-4505-ac94-adaf24c337e8-nginx-conf\") pod \"d4957cda-9179-4505-ac94-adaf24c337e8\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.254853 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert\") pod \"d4957cda-9179-4505-ac94-adaf24c337e8\" (UID: \"d4957cda-9179-4505-ac94-adaf24c337e8\") " Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.261422 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert" (OuterVolumeSpecName: "plugin-serving-cert") pod "d4957cda-9179-4505-ac94-adaf24c337e8" (UID: "d4957cda-9179-4505-ac94-adaf24c337e8"). InnerVolumeSpecName "plugin-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.262471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4957cda-9179-4505-ac94-adaf24c337e8-kube-api-access-ptp5c" (OuterVolumeSpecName: "kube-api-access-ptp5c") pod "d4957cda-9179-4505-ac94-adaf24c337e8" (UID: "d4957cda-9179-4505-ac94-adaf24c337e8"). 
InnerVolumeSpecName "kube-api-access-ptp5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.282083 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4957cda-9179-4505-ac94-adaf24c337e8-nginx-conf" (OuterVolumeSpecName: "nginx-conf") pod "d4957cda-9179-4505-ac94-adaf24c337e8" (UID: "d4957cda-9179-4505-ac94-adaf24c337e8"). InnerVolumeSpecName "nginx-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.357084 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptp5c\" (UniqueName: \"kubernetes.io/projected/d4957cda-9179-4505-ac94-adaf24c337e8-kube-api-access-ptp5c\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.357125 4764 reconciler_common.go:293] "Volume detached for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4957cda-9179-4505-ac94-adaf24c337e8-nginx-conf\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.357141 4764 reconciler_common.go:293] "Volume detached for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d4957cda-9179-4505-ac94-adaf24c337e8-plugin-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.421390 4764 generic.go:334] "Generic (PLEG): container finished" podID="d4957cda-9179-4505-ac94-adaf24c337e8" containerID="ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d" exitCode=0 Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.421466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" event={"ID":"d4957cda-9179-4505-ac94-adaf24c337e8","Type":"ContainerDied","Data":"ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d"} Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.421473 
4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.421492 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n" event={"ID":"d4957cda-9179-4505-ac94-adaf24c337e8","Type":"ContainerDied","Data":"945a117d3e967cb799cf7f2815cb42a512407ab0f236bf6c89d83bf3e0bdc511"} Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.421509 4764 scope.go:117] "RemoveContainer" containerID="ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.430597 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="4bf2e42740725b9d54c8d60efb2a207718601c4c6231f1e898fc274c1b294773" exitCode=0 Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.431562 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"4bf2e42740725b9d54c8d60efb2a207718601c4c6231f1e898fc274c1b294773"} Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.431593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176"} Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.484470 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n"] Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.485363 4764 scope.go:117] "RemoveContainer" containerID="ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d" Oct 01 16:26:52 crc kubenswrapper[4764]: E1001 
16:26:52.485856 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d\": container with ID starting with ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d not found: ID does not exist" containerID="ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.485903 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d"} err="failed to get container status \"ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d\": rpc error: code = NotFound desc = could not find container \"ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d\": container with ID starting with ab5f25d27a4b6e43584b8e67bcf899b03661bf6fe95740def633d94f2ccab25d not found: ID does not exist" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.485931 4764 scope.go:117] "RemoveContainer" containerID="98c11f42deb2a855802db6e539c07b78ed64042cc307603fe80868f20ffd6d4f" Oct 01 16:26:52 crc kubenswrapper[4764]: I1001 16:26:52.493664 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-f948n"] Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.660480 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"] Oct 01 16:26:53 crc kubenswrapper[4764]: E1001 16:26:53.668760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4957cda-9179-4505-ac94-adaf24c337e8" containerName="nmstate-console-plugin" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.668853 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4957cda-9179-4505-ac94-adaf24c337e8" containerName="nmstate-console-plugin" Oct 01 16:26:53 crc 
kubenswrapper[4764]: E1001 16:26:53.668943 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca151bec-53be-465f-a65d-7cd62254e50f" containerName="nmstate-operator" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.669011 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca151bec-53be-465f-a65d-7cd62254e50f" containerName="nmstate-operator" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.669321 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4957cda-9179-4505-ac94-adaf24c337e8" containerName="nmstate-console-plugin" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.669430 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca151bec-53be-465f-a65d-7cd62254e50f" containerName="nmstate-operator" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.670305 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.706487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"] Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.759340 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4957cda-9179-4505-ac94-adaf24c337e8" path="/var/lib/kubelet/pods/d4957cda-9179-4505-ac94-adaf24c337e8/volumes" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.787787 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-apiservice-cert\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.787830 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qlv6\" (UniqueName: \"kubernetes.io/projected/08dc3cf9-a8da-48f8-bb23-77eb6740d676-kube-api-access-9qlv6\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.787853 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-webhook-cert\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.895302 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-apiservice-cert\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.895387 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qlv6\" (UniqueName: \"kubernetes.io/projected/08dc3cf9-a8da-48f8-bb23-77eb6740d676-kube-api-access-9qlv6\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.895427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-webhook-cert\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.906177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-apiservice-cert\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.914412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-webhook-cert\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.944021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qlv6\" (UniqueName: \"kubernetes.io/projected/08dc3cf9-a8da-48f8-bb23-77eb6740d676-kube-api-access-9qlv6\") pod \"metallb-operator-controller-manager-56b6c48bf8-msr9x\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.957899 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"] Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.959190 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:53 crc kubenswrapper[4764]: I1001 16:26:53.976145 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"] Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.061133 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.135718 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-apiservice-cert\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.135807 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wbn6\" (UniqueName: \"kubernetes.io/projected/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-kube-api-access-5wbn6\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.135850 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-webhook-cert\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.237384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-webhook-cert\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.237842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-apiservice-cert\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.238271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wbn6\" (UniqueName: \"kubernetes.io/projected/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-kube-api-access-5wbn6\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.245962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-apiservice-cert\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.247680 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-webhook-cert\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " 
pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.270762 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wbn6\" (UniqueName: \"kubernetes.io/projected/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-kube-api-access-5wbn6\") pod \"metallb-operator-webhook-server-74c8b6564b-hsdqw\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:54 crc kubenswrapper[4764]: I1001 16:26:54.331283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:26:55 crc kubenswrapper[4764]: I1001 16:26:55.436979 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"] Oct 01 16:26:55 crc kubenswrapper[4764]: I1001 16:26:55.470854 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" event={"ID":"08dc3cf9-a8da-48f8-bb23-77eb6740d676","Type":"ContainerStarted","Data":"37329a5537cabdb2ceff0964c171ff51dd5ae5f04b5c37d1680553df257a113f"} Oct 01 16:26:55 crc kubenswrapper[4764]: I1001 16:26:55.473796 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" event={"ID":"e8f3e872-b4e9-4b58-95aa-f63812824933","Type":"ContainerStarted","Data":"d862d0435a2a69f27379dfaa16ca6ec66336f337606bd3bd78613ffad2e698a1"} Oct 01 16:26:55 crc kubenswrapper[4764]: I1001 16:26:55.515762 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dg2fb" podStartSLOduration=1.698199402 podStartE2EDuration="12.515728443s" podCreationTimestamp="2025-10-01 16:26:43 +0000 UTC" firstStartedPulling="2025-10-01 16:26:44.037679103 +0000 UTC m=+1467.037325938" 
lastFinishedPulling="2025-10-01 16:26:54.855208144 +0000 UTC m=+1477.854854979" observedRunningTime="2025-10-01 16:26:55.509017418 +0000 UTC m=+1478.508664263" watchObservedRunningTime="2025-10-01 16:26:55.515728443 +0000 UTC m=+1478.515375278"
Oct 01 16:26:55 crc kubenswrapper[4764]: I1001 16:26:55.524373 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"]
Oct 01 16:26:56 crc kubenswrapper[4764]: I1001 16:26:56.482516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" event={"ID":"ca2c1940-6297-4207-b2d1-16d5fcb28d5e","Type":"ContainerStarted","Data":"2332cda9153c00371a1ef45f2e620b27e1940a593941bc27ad2f11f228eb3d11"}
Oct 01 16:26:58 crc kubenswrapper[4764]: I1001 16:26:58.561842 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-snhkz"
Oct 01 16:27:02 crc kubenswrapper[4764]: I1001 16:27:02.562742 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" event={"ID":"ca2c1940-6297-4207-b2d1-16d5fcb28d5e","Type":"ContainerStarted","Data":"08138863923be7b2ac1dd93bc5e1f2422461d6ce0da0271687ff1681c757d747"}
Oct 01 16:27:02 crc kubenswrapper[4764]: I1001 16:27:02.563399 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"
Oct 01 16:27:02 crc kubenswrapper[4764]: I1001 16:27:02.565203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" event={"ID":"08dc3cf9-a8da-48f8-bb23-77eb6740d676","Type":"ContainerStarted","Data":"d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9"}
Oct 01 16:27:02 crc kubenswrapper[4764]: I1001 16:27:02.565372 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"
Oct 01 16:27:02 crc kubenswrapper[4764]: I1001 16:27:02.658907 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" podStartSLOduration=3.296898148 podStartE2EDuration="9.658883846s" podCreationTimestamp="2025-10-01 16:26:53 +0000 UTC" firstStartedPulling="2025-10-01 16:26:55.535639061 +0000 UTC m=+1478.535285896" lastFinishedPulling="2025-10-01 16:27:01.897624759 +0000 UTC m=+1484.897271594" observedRunningTime="2025-10-01 16:27:02.608723017 +0000 UTC m=+1485.608369872" watchObservedRunningTime="2025-10-01 16:27:02.658883846 +0000 UTC m=+1485.658530681"
Oct 01 16:27:02 crc kubenswrapper[4764]: I1001 16:27:02.665786 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" podStartSLOduration=3.239179953 podStartE2EDuration="9.665767564s" podCreationTimestamp="2025-10-01 16:26:53 +0000 UTC" firstStartedPulling="2025-10-01 16:26:55.440273525 +0000 UTC m=+1478.439920360" lastFinishedPulling="2025-10-01 16:27:01.866861136 +0000 UTC m=+1484.866507971" observedRunningTime="2025-10-01 16:27:02.641877659 +0000 UTC m=+1485.641524494" watchObservedRunningTime="2025-10-01 16:27:02.665767564 +0000 UTC m=+1485.665414399"
Oct 01 16:27:03 crc kubenswrapper[4764]: I1001 16:27:03.748270 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bmnmm"
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.338646 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.445159 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"]
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.445403 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf" podUID="b985bc65-1311-4e4b-a037-f73bb4178c84" containerName="webhook-server" containerID="cri-o://6d94563ba329750d57d1b7f28cd92bff0b61c481aaf512332043cd4518d4940d" gracePeriod=2
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.455523 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"]
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.702704 4764 generic.go:334] "Generic (PLEG): container finished" podID="b985bc65-1311-4e4b-a037-f73bb4178c84" containerID="6d94563ba329750d57d1b7f28cd92bff0b61c481aaf512332043cd4518d4940d" exitCode=0
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.906776 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.976414 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-apiservice-cert\") pod \"b985bc65-1311-4e4b-a037-f73bb4178c84\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") "
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.976556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-webhook-cert\") pod \"b985bc65-1311-4e4b-a037-f73bb4178c84\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") "
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.976632 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8n2r\" (UniqueName: \"kubernetes.io/projected/b985bc65-1311-4e4b-a037-f73bb4178c84-kube-api-access-x8n2r\") pod \"b985bc65-1311-4e4b-a037-f73bb4178c84\" (UID: \"b985bc65-1311-4e4b-a037-f73bb4178c84\") "
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.982936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "b985bc65-1311-4e4b-a037-f73bb4178c84" (UID: "b985bc65-1311-4e4b-a037-f73bb4178c84"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.982980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b985bc65-1311-4e4b-a037-f73bb4178c84-kube-api-access-x8n2r" (OuterVolumeSpecName: "kube-api-access-x8n2r") pod "b985bc65-1311-4e4b-a037-f73bb4178c84" (UID: "b985bc65-1311-4e4b-a037-f73bb4178c84"). InnerVolumeSpecName "kube-api-access-x8n2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:27:14 crc kubenswrapper[4764]: I1001 16:27:14.983956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "b985bc65-1311-4e4b-a037-f73bb4178c84" (UID: "b985bc65-1311-4e4b-a037-f73bb4178c84"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:15 crc kubenswrapper[4764]: I1001 16:27:15.078310 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:15 crc kubenswrapper[4764]: I1001 16:27:15.078344 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8n2r\" (UniqueName: \"kubernetes.io/projected/b985bc65-1311-4e4b-a037-f73bb4178c84-kube-api-access-x8n2r\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:15 crc kubenswrapper[4764]: I1001 16:27:15.078354 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b985bc65-1311-4e4b-a037-f73bb4178c84-apiservice-cert\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:15 crc kubenswrapper[4764]: I1001 16:27:15.711723 4764 scope.go:117] "RemoveContainer" containerID="6d94563ba329750d57d1b7f28cd92bff0b61c481aaf512332043cd4518d4940d"
Oct 01 16:27:15 crc kubenswrapper[4764]: I1001 16:27:15.712103 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6748f49456-mtpnf"
Oct 01 16:27:15 crc kubenswrapper[4764]: I1001 16:27:15.737325 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b985bc65-1311-4e4b-a037-f73bb4178c84" path="/var/lib/kubelet/pods/b985bc65-1311-4e4b-a037-f73bb4178c84/volumes"
Oct 01 16:27:16 crc kubenswrapper[4764]: I1001 16:27:16.727267 4764 generic.go:334] "Generic (PLEG): container finished" podID="66071483-0a25-4d14-afea-3f08fe54ddc5" containerID="cb90743ae2f3ca72f211a1043c24b28326f767dde457945e57513182d60721d0" exitCode=0
Oct 01 16:27:16 crc kubenswrapper[4764]: I1001 16:27:16.727311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" event={"ID":"66071483-0a25-4d14-afea-3f08fe54ddc5","Type":"ContainerDied","Data":"cb90743ae2f3ca72f211a1043c24b28326f767dde457945e57513182d60721d0"}
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.256684 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.338656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-bootstrap-combined-ca-bundle\") pod \"66071483-0a25-4d14-afea-3f08fe54ddc5\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") "
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.338852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5p52\" (UniqueName: \"kubernetes.io/projected/66071483-0a25-4d14-afea-3f08fe54ddc5-kube-api-access-m5p52\") pod \"66071483-0a25-4d14-afea-3f08fe54ddc5\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") "
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.338887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-inventory\") pod \"66071483-0a25-4d14-afea-3f08fe54ddc5\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") "
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.338965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-ssh-key\") pod \"66071483-0a25-4d14-afea-3f08fe54ddc5\" (UID: \"66071483-0a25-4d14-afea-3f08fe54ddc5\") "
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.345629 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66071483-0a25-4d14-afea-3f08fe54ddc5-kube-api-access-m5p52" (OuterVolumeSpecName: "kube-api-access-m5p52") pod "66071483-0a25-4d14-afea-3f08fe54ddc5" (UID: "66071483-0a25-4d14-afea-3f08fe54ddc5"). InnerVolumeSpecName "kube-api-access-m5p52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.360740 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "66071483-0a25-4d14-afea-3f08fe54ddc5" (UID: "66071483-0a25-4d14-afea-3f08fe54ddc5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.378332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "66071483-0a25-4d14-afea-3f08fe54ddc5" (UID: "66071483-0a25-4d14-afea-3f08fe54ddc5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.383795 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-inventory" (OuterVolumeSpecName: "inventory") pod "66071483-0a25-4d14-afea-3f08fe54ddc5" (UID: "66071483-0a25-4d14-afea-3f08fe54ddc5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.441266 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.441298 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5p52\" (UniqueName: \"kubernetes.io/projected/66071483-0a25-4d14-afea-3f08fe54ddc5-kube-api-access-m5p52\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.441310 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.441321 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/66071483-0a25-4d14-afea-3f08fe54ddc5-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.749110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j" event={"ID":"66071483-0a25-4d14-afea-3f08fe54ddc5","Type":"ContainerDied","Data":"420245b555ae691141edcdccf0232637811fb0719cc461df1bae80f9948e8e93"}
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.749407 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="420245b555ae691141edcdccf0232637811fb0719cc461df1bae80f9948e8e93"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.749215 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.841104 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"]
Oct 01 16:27:18 crc kubenswrapper[4764]: E1001 16:27:18.841522 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b985bc65-1311-4e4b-a037-f73bb4178c84" containerName="webhook-server"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.841553 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b985bc65-1311-4e4b-a037-f73bb4178c84" containerName="webhook-server"
Oct 01 16:27:18 crc kubenswrapper[4764]: E1001 16:27:18.841607 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66071483-0a25-4d14-afea-3f08fe54ddc5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.841617 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="66071483-0a25-4d14-afea-3f08fe54ddc5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.841810 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="66071483-0a25-4d14-afea-3f08fe54ddc5" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.841848 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b985bc65-1311-4e4b-a037-f73bb4178c84" containerName="webhook-server"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.843271 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.845392 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.845795 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.848092 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.849031 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.875835 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"]
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.995469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28wm2\" (UniqueName: \"kubernetes.io/projected/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-kube-api-access-28wm2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.995570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:18 crc kubenswrapper[4764]: I1001 16:27:18.995590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.097396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28wm2\" (UniqueName: \"kubernetes.io/projected/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-kube-api-access-28wm2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.097722 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.097852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.102843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.115133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28wm2\" (UniqueName: \"kubernetes.io/projected/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-kube-api-access-28wm2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.122003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kdntm\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.172711 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.665278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"]
Oct 01 16:27:19 crc kubenswrapper[4764]: I1001 16:27:19.770776 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" event={"ID":"a4b541d9-8d8f-4dfb-9c59-f00ba871257c","Type":"ContainerStarted","Data":"fce90c6ca0cdc67d5fec2a63c67723d61c48ac1f2341ffc36389941294175bc1"}
Oct 01 16:27:20 crc kubenswrapper[4764]: I1001 16:27:20.783299 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" event={"ID":"a4b541d9-8d8f-4dfb-9c59-f00ba871257c","Type":"ContainerStarted","Data":"8a5921afb3a3713d040a30cb566f6eb2d40f370557ec018a3b780d0ffde91301"}
Oct 01 16:27:20 crc kubenswrapper[4764]: I1001 16:27:20.814388 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" podStartSLOduration=2.37082578 podStartE2EDuration="2.814365243s" podCreationTimestamp="2025-10-01 16:27:18 +0000 UTC" firstStartedPulling="2025-10-01 16:27:19.674833667 +0000 UTC m=+1502.674480512" lastFinishedPulling="2025-10-01 16:27:20.1183731 +0000 UTC m=+1503.118019975" observedRunningTime="2025-10-01 16:27:20.805345624 +0000 UTC m=+1503.804992479" watchObservedRunningTime="2025-10-01 16:27:20.814365243 +0000 UTC m=+1503.814012078"
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.066959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.215108 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"]
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.215426 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" podUID="a75ea530-e233-41d3-a607-07ad2b30c4f1" containerName="manager" containerID="cri-o://06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949" gracePeriod=10
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.751833 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.892472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrsvq\" (UniqueName: \"kubernetes.io/projected/a75ea530-e233-41d3-a607-07ad2b30c4f1-kube-api-access-zrsvq\") pod \"a75ea530-e233-41d3-a607-07ad2b30c4f1\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") "
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.892520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-webhook-cert\") pod \"a75ea530-e233-41d3-a607-07ad2b30c4f1\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") "
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.892639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-apiservice-cert\") pod \"a75ea530-e233-41d3-a607-07ad2b30c4f1\" (UID: \"a75ea530-e233-41d3-a607-07ad2b30c4f1\") "
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.900039 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "a75ea530-e233-41d3-a607-07ad2b30c4f1" (UID: "a75ea530-e233-41d3-a607-07ad2b30c4f1"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.900205 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75ea530-e233-41d3-a607-07ad2b30c4f1-kube-api-access-zrsvq" (OuterVolumeSpecName: "kube-api-access-zrsvq") pod "a75ea530-e233-41d3-a607-07ad2b30c4f1" (UID: "a75ea530-e233-41d3-a607-07ad2b30c4f1"). InnerVolumeSpecName "kube-api-access-zrsvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.900214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "a75ea530-e233-41d3-a607-07ad2b30c4f1" (UID: "a75ea530-e233-41d3-a607-07ad2b30c4f1"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.929472 4764 generic.go:334] "Generic (PLEG): container finished" podID="a75ea530-e233-41d3-a607-07ad2b30c4f1" containerID="06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949" exitCode=0
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.929532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" event={"ID":"a75ea530-e233-41d3-a607-07ad2b30c4f1","Type":"ContainerDied","Data":"06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949"}
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.929565 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.929599 4764 scope.go:117] "RemoveContainer" containerID="06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949"
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.929580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms" event={"ID":"a75ea530-e233-41d3-a607-07ad2b30c4f1","Type":"ContainerDied","Data":"edd3f3289bd44e4daa76dcbed8e719922a5aed416c837440722c0ec8da2effec"}
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.997886 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-apiservice-cert\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.997945 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrsvq\" (UniqueName: \"kubernetes.io/projected/a75ea530-e233-41d3-a607-07ad2b30c4f1-kube-api-access-zrsvq\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:34 crc kubenswrapper[4764]: I1001 16:27:34.997967 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a75ea530-e233-41d3-a607-07ad2b30c4f1-webhook-cert\") on node \"crc\" DevicePath \"\""
Oct 01 16:27:35 crc kubenswrapper[4764]: I1001 16:27:35.028310 4764 scope.go:117] "RemoveContainer" containerID="06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949"
Oct 01 16:27:35 crc kubenswrapper[4764]: E1001 16:27:35.029398 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949\": container with ID starting with 06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949 not found: ID does not exist" containerID="06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949"
Oct 01 16:27:35 crc kubenswrapper[4764]: I1001 16:27:35.029455 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949"} err="failed to get container status \"06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949\": rpc error: code = NotFound desc = could not find container \"06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949\": container with ID starting with 06e2cc05fea9cf8f422ca7a0d6bdea761b874450a8277653ffdb0c345041d949 not found: ID does not exist"
Oct 01 16:27:35 crc kubenswrapper[4764]: I1001 16:27:35.033445 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"]
Oct 01 16:27:35 crc kubenswrapper[4764]: I1001 16:27:35.043724 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6694fb7ccb-888ms"]
Oct 01 16:27:35 crc kubenswrapper[4764]: I1001 16:27:35.738602 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75ea530-e233-41d3-a607-07ad2b30c4f1" path="/var/lib/kubelet/pods/a75ea530-e233-41d3-a607-07ad2b30c4f1/volumes"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.155440 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"]
Oct 01 16:27:43 crc kubenswrapper[4764]: E1001 16:27:43.156455 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75ea530-e233-41d3-a607-07ad2b30c4f1" containerName="manager"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.156470 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75ea530-e233-41d3-a607-07ad2b30c4f1" containerName="manager"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.156669 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75ea530-e233-41d3-a607-07ad2b30c4f1" containerName="manager"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.157483 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.168796 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"]
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.293351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b485840-2253-4eb1-888b-5e16d76a3a3d-apiservice-cert\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.293443 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b485840-2253-4eb1-888b-5e16d76a3a3d-webhook-cert\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.293565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zbd\" (UniqueName: \"kubernetes.io/projected/3b485840-2253-4eb1-888b-5e16d76a3a3d-kube-api-access-m8zbd\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.395092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zbd\" (UniqueName: \"kubernetes.io/projected/3b485840-2253-4eb1-888b-5e16d76a3a3d-kube-api-access-m8zbd\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.395190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b485840-2253-4eb1-888b-5e16d76a3a3d-apiservice-cert\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.395292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b485840-2253-4eb1-888b-5e16d76a3a3d-webhook-cert\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.399086 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"]
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.400595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b485840-2253-4eb1-888b-5e16d76a3a3d-apiservice-cert\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.402727 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.412472 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"]
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.413557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b485840-2253-4eb1-888b-5e16d76a3a3d-webhook-cert\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.430172 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zbd\" (UniqueName: \"kubernetes.io/projected/3b485840-2253-4eb1-888b-5e16d76a3a3d-kube-api-access-m8zbd\") pod \"metallb-operator-controller-manager-845477fbc7-z8qk5\" (UID: \"3b485840-2253-4eb1-888b-5e16d76a3a3d\") " pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.473504 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.498125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-apiservice-cert\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.498432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6hh\" (UniqueName: \"kubernetes.io/projected/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-kube-api-access-vq6hh\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.498472 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-webhook-cert\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.602073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-apiservice-cert\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.602178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq6hh\" (UniqueName: \"kubernetes.io/projected/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-kube-api-access-vq6hh\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.602217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-webhook-cert\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.607837 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-apiservice-cert\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.619765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-webhook-cert\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") " pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"
Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.631700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq6hh\" (UniqueName: \"kubernetes.io/projected/54e1ea89-8cd7-4a1c-a1b9-e20f321198f7-kube-api-access-vq6hh\") pod \"metallb-operator-webhook-server-755d79df97-5b9ck\" (UID: \"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7\") "
pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" Oct 01 16:27:43 crc kubenswrapper[4764]: I1001 16:27:43.817892 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" Oct 01 16:27:44 crc kubenswrapper[4764]: I1001 16:27:44.042512 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5"] Oct 01 16:27:44 crc kubenswrapper[4764]: W1001 16:27:44.043011 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b485840_2253_4eb1_888b_5e16d76a3a3d.slice/crio-12660f4095e9d515cbca99ef76868134f66c93bfd72c73aef5cf307a546b62e0 WatchSource:0}: Error finding container 12660f4095e9d515cbca99ef76868134f66c93bfd72c73aef5cf307a546b62e0: Status 404 returned error can't find the container with id 12660f4095e9d515cbca99ef76868134f66c93bfd72c73aef5cf307a546b62e0 Oct 01 16:27:44 crc kubenswrapper[4764]: I1001 16:27:44.283917 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck"] Oct 01 16:27:44 crc kubenswrapper[4764]: W1001 16:27:44.291247 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54e1ea89_8cd7_4a1c_a1b9_e20f321198f7.slice/crio-3c18857f59d08454b22a03d8f318227ec03038ca9858bc821390bfe21a896165 WatchSource:0}: Error finding container 3c18857f59d08454b22a03d8f318227ec03038ca9858bc821390bfe21a896165: Status 404 returned error can't find the container with id 3c18857f59d08454b22a03d8f318227ec03038ca9858bc821390bfe21a896165 Oct 01 16:27:44 crc kubenswrapper[4764]: I1001 16:27:44.468290 4764 scope.go:117] "RemoveContainer" containerID="b4c12b63e1ee8008fa5648fa6cbbe020eda19b95f4a985767e3c949da3e9e973" Oct 01 16:27:44 crc kubenswrapper[4764]: I1001 16:27:44.515038 4764 
scope.go:117] "RemoveContainer" containerID="2aabedb919decbef54e40c4f1906f9b2f6aabf0ccffe729437036193121cf58d" Oct 01 16:27:44 crc kubenswrapper[4764]: I1001 16:27:44.536899 4764 scope.go:117] "RemoveContainer" containerID="03cdb772aa1c9e95835840fb1b5823a012e11cb175b6bd8f4750ce44e48aa7fa" Oct 01 16:27:44 crc kubenswrapper[4764]: I1001 16:27:44.562856 4764 scope.go:117] "RemoveContainer" containerID="5e2855d6c89e7d34ad2846a73c8a485eb15b8dba6a1d8db70fe39daca0369ab0" Oct 01 16:27:45 crc kubenswrapper[4764]: I1001 16:27:45.025031 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" event={"ID":"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7","Type":"ContainerStarted","Data":"744d3f1847f8e821ce5dd3c1735d08f4361c8e110641a148d74be36e2bea57a7"} Oct 01 16:27:45 crc kubenswrapper[4764]: I1001 16:27:45.025111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" event={"ID":"54e1ea89-8cd7-4a1c-a1b9-e20f321198f7","Type":"ContainerStarted","Data":"3c18857f59d08454b22a03d8f318227ec03038ca9858bc821390bfe21a896165"} Oct 01 16:27:45 crc kubenswrapper[4764]: I1001 16:27:45.032276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5" event={"ID":"3b485840-2253-4eb1-888b-5e16d76a3a3d","Type":"ContainerStarted","Data":"07b41b6f1cd8091f56c204e4e71d41a37061e683072eef17e6c69a1dc284cdf0"} Oct 01 16:27:45 crc kubenswrapper[4764]: I1001 16:27:45.032376 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5" Oct 01 16:27:45 crc kubenswrapper[4764]: I1001 16:27:45.032393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5" 
event={"ID":"3b485840-2253-4eb1-888b-5e16d76a3a3d","Type":"ContainerStarted","Data":"12660f4095e9d515cbca99ef76868134f66c93bfd72c73aef5cf307a546b62e0"} Oct 01 16:27:45 crc kubenswrapper[4764]: I1001 16:27:45.049684 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" podStartSLOduration=2.049659914 podStartE2EDuration="2.049659914s" podCreationTimestamp="2025-10-01 16:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:27:45.047379308 +0000 UTC m=+1528.047026163" watchObservedRunningTime="2025-10-01 16:27:45.049659914 +0000 UTC m=+1528.049306749" Oct 01 16:27:46 crc kubenswrapper[4764]: I1001 16:27:46.043340 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.323592 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5" podStartSLOduration=9.323572246 podStartE2EDuration="9.323572246s" podCreationTimestamp="2025-10-01 16:27:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:27:45.07838166 +0000 UTC m=+1528.078028495" watchObservedRunningTime="2025-10-01 16:27:52.323572246 +0000 UTC m=+1535.323219081" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.330701 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/frr-k8s-6tr29"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.331287 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr-metrics" 
containerID="cri-o://2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.331306 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy" containerID="cri-o://e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.331429 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy-frr" containerID="cri-o://2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.331476 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="reloader" containerID="cri-o://41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.331536 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr" containerID="cri-o://e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.331616 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6tr29" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="controller" containerID="cri-o://5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.340626 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/frr-k8s-6tr29"] Oct 
01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362032 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff"] Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362605 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="reloader" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362621 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="reloader" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362654 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="cp-reloader" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362663 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="cp-reloader" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362698 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362708 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362720 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="cp-frr-files" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362727 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="cp-frr-files" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362737 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="cp-metrics" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362746 4764 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="cp-metrics" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362763 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr-metrics" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362770 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr-metrics" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362781 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy-frr" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362788 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy-frr" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362812 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362819 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.362832 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="controller" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.362838 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="controller" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363063 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy-frr" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363084 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="reloader" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363094 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363104 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="frr-metrics" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363116 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="kube-rbac-proxy" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363126 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerName="controller" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.363988 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.382574 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.417713 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-k2274"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.426833 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.440096 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/speaker-dq7mc"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.440346 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/speaker-dq7mc" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="speaker" containerID="cri-o://97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.440490 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/speaker-dq7mc" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="kube-rbac-proxy" containerID="cri-o://74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1" gracePeriod=2 Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.457196 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/speaker-dq7mc"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.478958 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-mxkb8"] Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.479714 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="speaker" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.479740 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="speaker" Oct 01 16:27:52 crc kubenswrapper[4764]: E1001 16:27:52.479764 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="kube-rbac-proxy" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.479773 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" 
containerName="kube-rbac-proxy" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.479987 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="kube-rbac-proxy" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480027 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerName="speaker" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-startup\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7c7ca03-94fe-4d3b-914b-669bfd41d526-metrics-certs\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-sockets\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfkx\" (UniqueName: \"kubernetes.io/projected/247af8f1-5e4b-4e17-9d31-055bdce2a1d6-kube-api-access-shfkx\") pod \"frr-k8s-webhook-server-64bf5d555-bhdff\" (UID: \"247af8f1-5e4b-4e17-9d31-055bdce2a1d6\") " 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/247af8f1-5e4b-4e17-9d31-055bdce2a1d6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-bhdff\" (UID: \"247af8f1-5e4b-4e17-9d31-055bdce2a1d6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480423 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-reloader\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480457 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-metrics\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-conf\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.480589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmgw\" (UniqueName: \"kubernetes.io/projected/d7c7ca03-94fe-4d3b-914b-669bfd41d526-kube-api-access-clmgw\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc 
kubenswrapper[4764]: I1001 16:27:52.482789 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.541865 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-mxkb8"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.568584 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zm88x"] Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.570638 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583631 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-metrics-certs\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583727 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-startup\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7c7ca03-94fe-4d3b-914b-669bfd41d526-metrics-certs\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583796 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-sockets\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-cert\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583861 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfkx\" (UniqueName: \"kubernetes.io/projected/247af8f1-5e4b-4e17-9d31-055bdce2a1d6-kube-api-access-shfkx\") pod \"frr-k8s-webhook-server-64bf5d555-bhdff\" (UID: \"247af8f1-5e4b-4e17-9d31-055bdce2a1d6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583909 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-reloader\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/247af8f1-5e4b-4e17-9d31-055bdce2a1d6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-bhdff\" (UID: \"247af8f1-5e4b-4e17-9d31-055bdce2a1d6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.583979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-metrics\") pod 
\"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.584017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-conf\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.584072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clmgw\" (UniqueName: \"kubernetes.io/projected/d7c7ca03-94fe-4d3b-914b-669bfd41d526-kube-api-access-clmgw\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.584129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57rnp\" (UniqueName: \"kubernetes.io/projected/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-kube-api-access-57rnp\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.585326 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-reloader\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.585784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-startup\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 
16:27:52.587220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-conf\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.588277 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-metrics\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.589118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/d7c7ca03-94fe-4d3b-914b-669bfd41d526-frr-sockets\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.603152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7c7ca03-94fe-4d3b-914b-669bfd41d526-metrics-certs\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.607143 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmgw\" (UniqueName: \"kubernetes.io/projected/d7c7ca03-94fe-4d3b-914b-669bfd41d526-kube-api-access-clmgw\") pod \"frr-k8s-k2274\" (UID: \"d7c7ca03-94fe-4d3b-914b-669bfd41d526\") " pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.611269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/247af8f1-5e4b-4e17-9d31-055bdce2a1d6-cert\") pod \"frr-k8s-webhook-server-64bf5d555-bhdff\" (UID: 
\"247af8f1-5e4b-4e17-9d31-055bdce2a1d6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.611590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfkx\" (UniqueName: \"kubernetes.io/projected/247af8f1-5e4b-4e17-9d31-055bdce2a1d6-kube-api-access-shfkx\") pod \"frr-k8s-webhook-server-64bf5d555-bhdff\" (UID: \"247af8f1-5e4b-4e17-9d31-055bdce2a1d6\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.688642 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-metrics-certs\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.688711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-memberlist\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.688770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-metallb-excludel2\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.688899 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-cert\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " 
pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.688999 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rqzk\" (UniqueName: \"kubernetes.io/projected/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-kube-api-access-8rqzk\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.689184 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-metrics-certs\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.689233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57rnp\" (UniqueName: \"kubernetes.io/projected/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-kube-api-access-57rnp\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.692518 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-cert\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.693067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-metrics-certs\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc 
kubenswrapper[4764]: I1001 16:27:52.705920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57rnp\" (UniqueName: \"kubernetes.io/projected/1e96a209-b683-4060-8c9f-b7b1ae8c89b0-kube-api-access-57rnp\") pod \"controller-68d546b9d8-mxkb8\" (UID: \"1e96a209-b683-4060-8c9f-b7b1ae8c89b0\") " pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.706756 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.765336 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6tr29" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.790609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-metrics-certs\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.790707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-memberlist\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.790753 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-metallb-excludel2\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.791261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8rqzk\" (UniqueName: \"kubernetes.io/projected/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-kube-api-access-8rqzk\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.792583 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-metallb-excludel2\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.795907 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-metrics-certs\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.799282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-memberlist\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.814981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rqzk\" (UniqueName: \"kubernetes.io/projected/0bf73b76-19c9-4264-abe9-80d24dcf6ee6-kube-api-access-8rqzk\") pod \"speaker-zm88x\" (UID: \"0bf73b76-19c9-4264-abe9-80d24dcf6ee6\") " pod="metallb-system/speaker-zm88x" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.842648 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-k2274" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.866537 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.892993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-reloader\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893089 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-startup\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893179 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics-certs\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893285 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-sockets\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893311 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2djw\" (UniqueName: \"kubernetes.io/projected/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-kube-api-access-t2djw\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-conf\") pod \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\" (UID: \"b922335a-fbf4-41ec-99b0-dafbd8b24bf5\") " Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.893894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-reloader" (OuterVolumeSpecName: "reloader") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "reloader". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.894382 4764 reconciler_common.go:293] "Volume detached for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-reloader\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.894764 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-conf" (OuterVolumeSpecName: "frr-conf") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "frr-conf". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.897478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-startup" (OuterVolumeSpecName: "frr-startup") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "frr-startup". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.897690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-sockets" (OuterVolumeSpecName: "frr-sockets") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "frr-sockets". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.897746 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.897901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-kube-api-access-t2djw" (OuterVolumeSpecName: "kube-api-access-t2djw") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "kube-api-access-t2djw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.900853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics" (OuterVolumeSpecName: "metrics") pod "b922335a-fbf4-41ec-99b0-dafbd8b24bf5" (UID: "b922335a-fbf4-41ec-99b0-dafbd8b24bf5"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.996019 4764 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.996065 4764 reconciler_common.go:293] "Volume detached for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-conf\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.996079 4764 reconciler_common.go:293] "Volume detached for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-startup\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.996090 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.996103 4764 reconciler_common.go:293] "Volume detached for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-frr-sockets\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:52 crc kubenswrapper[4764]: I1001 16:27:52.996116 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2djw\" (UniqueName: \"kubernetes.io/projected/b922335a-fbf4-41ec-99b0-dafbd8b24bf5-kube-api-access-t2djw\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.059591 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zm88x" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.102732 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-dq7mc" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.122176 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-mxkb8"] Oct 01 16:27:53 crc kubenswrapper[4764]: W1001 16:27:53.123629 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e96a209_b683_4060_8c9f_b7b1ae8c89b0.slice/crio-398c0e1dfadc32632a2e9e11fd46a39c376e26596e0bca9b1e5f1e5d853b6bde WatchSource:0}: Error finding container 398c0e1dfadc32632a2e9e11fd46a39c376e26596e0bca9b1e5f1e5d853b6bde: Status 404 returned error can't find the container with id 398c0e1dfadc32632a2e9e11fd46a39c376e26596e0bca9b1e5f1e5d853b6bde Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.156263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff"] Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.160306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"1d19f525846baef48519632583d9844d1e23659fe1934a16a593ea292d81407b"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.168463 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerID="74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1" exitCode=0 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.168488 4764 generic.go:334] "Generic (PLEG): container finished" podID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" containerID="97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a" exitCode=0 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.168559 4764 scope.go:117] "RemoveContainer" containerID="74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 
16:27:53.168606 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-dq7mc" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.178212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zm88x" event={"ID":"0bf73b76-19c9-4264-abe9-80d24dcf6ee6","Type":"ContainerStarted","Data":"25d0ee59055205fddb0db99a856938a7a3f950abfbcb3b767c9fe5cfade450be"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.194844 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" exitCode=0 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.194896 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" exitCode=0 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.194906 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" exitCode=143 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.194918 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" exitCode=0 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.194927 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" exitCode=143 Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.194949 4764 generic.go:334] "Generic (PLEG): container finished" podID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" exitCode=0 Oct 01 16:27:53 crc 
kubenswrapper[4764]: I1001 16:27:53.195002 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6tr29" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195147 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195167 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195174 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195181 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195189 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195196 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195212 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195218 4764 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195225 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195235 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195241 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195247 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195253 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195260 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195266 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195272 4764 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195278 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.195285 4764 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.197642 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-mxkb8" event={"ID":"1e96a209-b683-4060-8c9f-b7b1ae8c89b0","Type":"ContainerStarted","Data":"398c0e1dfadc32632a2e9e11fd46a39c376e26596e0bca9b1e5f1e5d853b6bde"} Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.198352 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs\") pod \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.198704 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") pod \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.198851 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t79mc\" (UniqueName: \"kubernetes.io/projected/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-kube-api-access-t79mc\") pod 
\"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.198886 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metallb-excludel2\") pod \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\" (UID: \"5b219b21-c1fa-4af1-9d7a-ad0f00c987b3\") " Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.203482 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metallb-excludel2" (OuterVolumeSpecName: "metallb-excludel2") pod "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3"). InnerVolumeSpecName "metallb-excludel2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.206655 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-kube-api-access-t79mc" (OuterVolumeSpecName: "kube-api-access-t79mc") pod "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3"). InnerVolumeSpecName "kube-api-access-t79mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.206656 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist" (OuterVolumeSpecName: "memberlist") pod "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3"). InnerVolumeSpecName "memberlist". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.208199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" (UID: "5b219b21-c1fa-4af1-9d7a-ad0f00c987b3"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.225932 4764 scope.go:117] "RemoveContainer" containerID="97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.253730 4764 scope.go:117] "RemoveContainer" containerID="74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.254402 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1\": container with ID starting with 74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1 not found: ID does not exist" containerID="74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.254454 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1"} err="failed to get container status \"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1\": rpc error: code = NotFound desc = could not find container \"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1\": container with ID starting with 74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.254486 4764 scope.go:117] 
"RemoveContainer" containerID="97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.254834 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a\": container with ID starting with 97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a not found: ID does not exist" containerID="97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.254868 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a"} err="failed to get container status \"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a\": rpc error: code = NotFound desc = could not find container \"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a\": container with ID starting with 97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.254899 4764 scope.go:117] "RemoveContainer" containerID="74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.255284 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1"} err="failed to get container status \"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1\": rpc error: code = NotFound desc = could not find container \"74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1\": container with ID starting with 74c59bf92c315fbc9ca562a69a373ac24323c13f4d953855bc4a668938aa60e1 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.255312 4764 
scope.go:117] "RemoveContainer" containerID="97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.255537 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a"} err="failed to get container status \"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a\": rpc error: code = NotFound desc = could not find container \"97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a\": container with ID starting with 97cdebefd7e46f4a95c650f2a3ae406a351e4d2a3174113e799efb0c864a389a not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.255558 4764 scope.go:117] "RemoveContainer" containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.286350 4764 scope.go:117] "RemoveContainer" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.301609 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t79mc\" (UniqueName: \"kubernetes.io/projected/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-kube-api-access-t79mc\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.301641 4764 reconciler_common.go:293] "Volume detached for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metallb-excludel2\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.301651 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.301660 4764 reconciler_common.go:293] "Volume detached for 
volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3-memberlist\") on node \"crc\" DevicePath \"\"" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.312358 4764 scope.go:117] "RemoveContainer" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.375084 4764 scope.go:117] "RemoveContainer" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.408964 4764 scope.go:117] "RemoveContainer" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.427418 4764 scope.go:117] "RemoveContainer" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.461397 4764 scope.go:117] "RemoveContainer" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.564570 4764 scope.go:117] "RemoveContainer" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.621111 4764 scope.go:117] "RemoveContainer" containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.677224 4764 scope.go:117] "RemoveContainer" containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.677677 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": container with ID starting with 2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf not found: ID does not exist" 
containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.677701 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"} err="failed to get container status \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": rpc error: code = NotFound desc = could not find container \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": container with ID starting with 2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.677721 4764 scope.go:117] "RemoveContainer" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.678302 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": container with ID starting with e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d not found: ID does not exist" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.678328 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"} err="failed to get container status \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": rpc error: code = NotFound desc = could not find container \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": container with ID starting with e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.678342 4764 scope.go:117] 
"RemoveContainer" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.678679 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": container with ID starting with 2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0 not found: ID does not exist" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.678707 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"} err="failed to get container status \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": rpc error: code = NotFound desc = could not find container \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": container with ID starting with 2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.678725 4764 scope.go:117] "RemoveContainer" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.679333 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": container with ID starting with 41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76 not found: ID does not exist" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.679355 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"} err="failed to get container status \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": rpc error: code = NotFound desc = could not find container \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": container with ID starting with 41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.679368 4764 scope.go:117] "RemoveContainer" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.679682 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": container with ID starting with e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094 not found: ID does not exist" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.679701 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"} err="failed to get container status \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": rpc error: code = NotFound desc = could not find container \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": container with ID starting with e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.679717 4764 scope.go:117] "RemoveContainer" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.680017 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": container with ID starting with 5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328 not found: ID does not exist" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.680040 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"} err="failed to get container status \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": rpc error: code = NotFound desc = could not find container \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": container with ID starting with 5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.680067 4764 scope.go:117] "RemoveContainer" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.681327 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": container with ID starting with 62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3 not found: ID does not exist" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.681351 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"} err="failed to get container status \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": rpc error: code = NotFound desc = could not find container 
\"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": container with ID starting with 62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.681370 4764 scope.go:117] "RemoveContainer" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.681744 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": container with ID starting with 019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f not found: ID does not exist" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.681765 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"} err="failed to get container status \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": rpc error: code = NotFound desc = could not find container \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": container with ID starting with 019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.681777 4764 scope.go:117] "RemoveContainer" containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" Oct 01 16:27:53 crc kubenswrapper[4764]: E1001 16:27:53.682098 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": container with ID starting with 94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b not found: ID does not exist" 
containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682127 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"} err="failed to get container status \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": rpc error: code = NotFound desc = could not find container \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": container with ID starting with 94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682145 4764 scope.go:117] "RemoveContainer" containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682414 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"} err="failed to get container status \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": rpc error: code = NotFound desc = could not find container \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": container with ID starting with 2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682435 4764 scope.go:117] "RemoveContainer" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682685 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"} err="failed to get container status \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": rpc error: code = NotFound desc = could 
not find container \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": container with ID starting with e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682701 4764 scope.go:117] "RemoveContainer" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682962 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"} err="failed to get container status \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": rpc error: code = NotFound desc = could not find container \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": container with ID starting with 2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.682979 4764 scope.go:117] "RemoveContainer" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.683214 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"} err="failed to get container status \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": rpc error: code = NotFound desc = could not find container \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": container with ID starting with 41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.683229 4764 scope.go:117] "RemoveContainer" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 
16:27:53.683434 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"} err="failed to get container status \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": rpc error: code = NotFound desc = could not find container \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": container with ID starting with e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.683451 4764 scope.go:117] "RemoveContainer" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.683677 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"} err="failed to get container status \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": rpc error: code = NotFound desc = could not find container \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": container with ID starting with 5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.683693 4764 scope.go:117] "RemoveContainer" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.683981 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"} err="failed to get container status \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": rpc error: code = NotFound desc = could not find container \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": container with ID starting with 
62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.684002 4764 scope.go:117] "RemoveContainer" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.684222 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"} err="failed to get container status \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": rpc error: code = NotFound desc = could not find container \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": container with ID starting with 019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.684241 4764 scope.go:117] "RemoveContainer" containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.684729 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"} err="failed to get container status \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": rpc error: code = NotFound desc = could not find container \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": container with ID starting with 94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.684748 4764 scope.go:117] "RemoveContainer" containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.684999 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"} err="failed to get container status \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": rpc error: code = NotFound desc = could not find container \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": container with ID starting with 2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685014 4764 scope.go:117] "RemoveContainer" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685326 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"} err="failed to get container status \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": rpc error: code = NotFound desc = could not find container \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": container with ID starting with e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685345 4764 scope.go:117] "RemoveContainer" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685525 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"} err="failed to get container status \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": rpc error: code = NotFound desc = could not find container \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": container with ID starting with 2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0 not found: ID does not 
exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685540 4764 scope.go:117] "RemoveContainer" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685829 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"} err="failed to get container status \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": rpc error: code = NotFound desc = could not find container \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": container with ID starting with 41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.685846 4764 scope.go:117] "RemoveContainer" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686169 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"} err="failed to get container status \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": rpc error: code = NotFound desc = could not find container \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": container with ID starting with e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686186 4764 scope.go:117] "RemoveContainer" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686394 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"} err="failed to get container status 
\"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": rpc error: code = NotFound desc = could not find container \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": container with ID starting with 5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686412 4764 scope.go:117] "RemoveContainer" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686627 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"} err="failed to get container status \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": rpc error: code = NotFound desc = could not find container \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": container with ID starting with 62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686644 4764 scope.go:117] "RemoveContainer" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686820 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"} err="failed to get container status \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": rpc error: code = NotFound desc = could not find container \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": container with ID starting with 019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.686836 4764 scope.go:117] "RemoveContainer" 
containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.687030 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"} err="failed to get container status \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": rpc error: code = NotFound desc = could not find container \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": container with ID starting with 94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.687060 4764 scope.go:117] "RemoveContainer" containerID="2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.687594 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf"} err="failed to get container status \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": rpc error: code = NotFound desc = could not find container \"2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf\": container with ID starting with 2911d201b735bf272677929f3f15f888837192dacca37041af81ac54e87948bf not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.687612 4764 scope.go:117] "RemoveContainer" containerID="e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.687892 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d"} err="failed to get container status \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": rpc error: code = NotFound desc = could 
not find container \"e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d\": container with ID starting with e2a71670f64b2de9e147f994d783d8a9896538c3a7dc1c6a92748f0b57980f9d not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.687913 4764 scope.go:117] "RemoveContainer" containerID="2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688237 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0"} err="failed to get container status \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": rpc error: code = NotFound desc = could not find container \"2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0\": container with ID starting with 2af13d4fd3f43bfce462ae8f18c4592494233f9cd2d05c5661612271014f7ad0 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688259 4764 scope.go:117] "RemoveContainer" containerID="41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688496 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76"} err="failed to get container status \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": rpc error: code = NotFound desc = could not find container \"41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76\": container with ID starting with 41ab92c4fc82e6a00c2f960cc39dc457786c3140edbb8e6676c8493aeb745f76 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688525 4764 scope.go:117] "RemoveContainer" containerID="e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 
16:27:53.688751 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094"} err="failed to get container status \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": rpc error: code = NotFound desc = could not find container \"e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094\": container with ID starting with e8109d736aa6dce94fbcf8edd1000a1ebee7bd543fed8cd5bc795f5b1ad34094 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688772 4764 scope.go:117] "RemoveContainer" containerID="5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688973 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328"} err="failed to get container status \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": rpc error: code = NotFound desc = could not find container \"5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328\": container with ID starting with 5f8ffe3c7516ff311358bca4e9f6a62129ca8ebe32f7d849d58314b5365fe328 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.688994 4764 scope.go:117] "RemoveContainer" containerID="62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.689325 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3"} err="failed to get container status \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": rpc error: code = NotFound desc = could not find container \"62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3\": container with ID starting with 
62253c348f786ee49876e9b74756ad6e4730d06a41fd6bb8677cff50723149d3 not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.689343 4764 scope.go:117] "RemoveContainer" containerID="019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.692202 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f"} err="failed to get container status \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": rpc error: code = NotFound desc = could not find container \"019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f\": container with ID starting with 019b7cd14c51af44740ffffaeaa5281cb8d091243f3926e0ea94794fd958818f not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.692256 4764 scope.go:117] "RemoveContainer" containerID="94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.692700 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b"} err="failed to get container status \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": rpc error: code = NotFound desc = could not find container \"94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b\": container with ID starting with 94f2a96a8075ae3504c832067d063e62d46824584a9fad5dfd375c1d59515d2b not found: ID does not exist" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.738196 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b219b21-c1fa-4af1-9d7a-ad0f00c987b3" path="/var/lib/kubelet/pods/5b219b21-c1fa-4af1-9d7a-ad0f00c987b3/volumes" Oct 01 16:27:53 crc kubenswrapper[4764]: I1001 16:27:53.738873 4764 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b922335a-fbf4-41ec-99b0-dafbd8b24bf5" path="/var/lib/kubelet/pods/b922335a-fbf4-41ec-99b0-dafbd8b24bf5/volumes" Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.212379 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zm88x" event={"ID":"0bf73b76-19c9-4264-abe9-80d24dcf6ee6","Type":"ContainerStarted","Data":"56648a90fca2b47c4e259a4961e4e8b51c93233926ba65dfc7f83f332ee3e593"} Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.212781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zm88x" event={"ID":"0bf73b76-19c9-4264-abe9-80d24dcf6ee6","Type":"ContainerStarted","Data":"c648440fa2d86b7b7adb1ad25cbe7d1ea9883813203bf8a4b263962e671bac76"} Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.212817 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zm88x" Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.217377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" event={"ID":"247af8f1-5e4b-4e17-9d31-055bdce2a1d6","Type":"ContainerStarted","Data":"626ae1726393136cfa0645300e5c9c36852fb025ed13c5e84f95d5e456da94ef"} Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.219151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-mxkb8" event={"ID":"1e96a209-b683-4060-8c9f-b7b1ae8c89b0","Type":"ContainerStarted","Data":"3d706eb96b6846c337d22fd9bc114778e5775f1d60877cf8dcbbfa61b0b59532"} Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.219185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-mxkb8" event={"ID":"1e96a209-b683-4060-8c9f-b7b1ae8c89b0","Type":"ContainerStarted","Data":"c4b05da4530327b37a718d8c5a1673ac3921e815d029d6d6c29409a0f6db143a"} Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.219346 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.243939 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zm88x" podStartSLOduration=2.243920712 podStartE2EDuration="2.243920712s" podCreationTimestamp="2025-10-01 16:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:27:54.23518267 +0000 UTC m=+1537.234829515" watchObservedRunningTime="2025-10-01 16:27:54.243920712 +0000 UTC m=+1537.243567547" Oct 01 16:27:54 crc kubenswrapper[4764]: I1001 16:27:54.258560 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-mxkb8" podStartSLOduration=2.258520386 podStartE2EDuration="2.258520386s" podCreationTimestamp="2025-10-01 16:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:27:54.252729735 +0000 UTC m=+1537.252376610" watchObservedRunningTime="2025-10-01 16:27:54.258520386 +0000 UTC m=+1537.258167261" Oct 01 16:28:00 crc kubenswrapper[4764]: I1001 16:28:00.291287 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7c7ca03-94fe-4d3b-914b-669bfd41d526" containerID="711f7eecb8b7c3f86f3ada50f54698fe6a38a6cc8887c605ab0030351b2849b2" exitCode=0 Oct 01 16:28:00 crc kubenswrapper[4764]: I1001 16:28:00.291365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerDied","Data":"711f7eecb8b7c3f86f3ada50f54698fe6a38a6cc8887c605ab0030351b2849b2"} Oct 01 16:28:00 crc kubenswrapper[4764]: I1001 16:28:00.295087 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" 
event={"ID":"247af8f1-5e4b-4e17-9d31-055bdce2a1d6","Type":"ContainerStarted","Data":"135f144746bc6ee22d3b52ffb20ea17dfc28e3ec60cee026d10983342bbca2ba"} Oct 01 16:28:00 crc kubenswrapper[4764]: I1001 16:28:00.295409 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:28:00 crc kubenswrapper[4764]: I1001 16:28:00.336926 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" podStartSLOduration=2.240844357 podStartE2EDuration="8.336901496s" podCreationTimestamp="2025-10-01 16:27:52 +0000 UTC" firstStartedPulling="2025-10-01 16:27:53.180372088 +0000 UTC m=+1536.180018923" lastFinishedPulling="2025-10-01 16:27:59.276429237 +0000 UTC m=+1542.276076062" observedRunningTime="2025-10-01 16:28:00.324370432 +0000 UTC m=+1543.324017267" watchObservedRunningTime="2025-10-01 16:28:00.336901496 +0000 UTC m=+1543.336548331" Oct 01 16:28:01 crc kubenswrapper[4764]: I1001 16:28:01.310197 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7c7ca03-94fe-4d3b-914b-669bfd41d526" containerID="8072aaaf5ac693dbce8fb4b5f998d3abc492afa2b85049324a3be2df3c0b91f6" exitCode=0 Oct 01 16:28:01 crc kubenswrapper[4764]: I1001 16:28:01.311277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerDied","Data":"8072aaaf5ac693dbce8fb4b5f998d3abc492afa2b85049324a3be2df3c0b91f6"} Oct 01 16:28:02 crc kubenswrapper[4764]: I1001 16:28:02.027883 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8c9sc"] Oct 01 16:28:02 crc kubenswrapper[4764]: I1001 16:28:02.038500 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8c9sc"] Oct 01 16:28:02 crc kubenswrapper[4764]: I1001 16:28:02.322367 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="d7c7ca03-94fe-4d3b-914b-669bfd41d526" containerID="51b3ce7bf8ec0380ac35a663da79a60e94a6277a4502b4a88bdd8a9cda4a1b24" exitCode=0 Oct 01 16:28:02 crc kubenswrapper[4764]: I1001 16:28:02.322414 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerDied","Data":"51b3ce7bf8ec0380ac35a663da79a60e94a6277a4502b4a88bdd8a9cda4a1b24"} Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.065576 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zm88x" Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.337959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"5c22ed5912a87779dbe35fbd6450c15cb6c7bdcb53cd72d0f5ada884f33a0cb7"} Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.338006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"87b113cf81663b9ab724255ca422b080d66214bb7a45f324875dad6c1661f3a2"} Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.338019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"aa7131e55e5301129a7f54479f6a50132b4d2ad9ecae1c0be9986f28d32c1732"} Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.338032 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"3ff29c4e43284dbcb7f070d00a3b4ba5e2b1f2a4c4740e420f01a71fa85cd714"} Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.736630 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="709ae6b5-8376-41e5-9d86-d6bf54ea147b" path="/var/lib/kubelet/pods/709ae6b5-8376-41e5-9d86-d6bf54ea147b/volumes" Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.823257 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-755d79df97-5b9ck" Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.906849 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"] Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.908070 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" podUID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" containerName="webhook-server" containerID="cri-o://08138863923be7b2ac1dd93bc5e1f2422461d6ce0da0271687ff1681c757d747" gracePeriod=2 Oct 01 16:28:03 crc kubenswrapper[4764]: I1001 16:28:03.922113 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw"] Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.380636 4764 generic.go:334] "Generic (PLEG): container finished" podID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" containerID="08138863923be7b2ac1dd93bc5e1f2422461d6ce0da0271687ff1681c757d747" exitCode=0 Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.380953 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2332cda9153c00371a1ef45f2e620b27e1940a593941bc27ad2f11f228eb3d11" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.412469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"1d4d08c446639df04cc4e3673f3bace37c5f35535af0fb1d7af519fd564feed6"} Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.412512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-k2274" event={"ID":"d7c7ca03-94fe-4d3b-914b-669bfd41d526","Type":"ContainerStarted","Data":"80dfdef75a61847457eb5b84ec4da758e1bbd0493651ab48b5da31282143bf5e"} Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.413818 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-k2274" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.422260 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.474089 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-k2274" podStartSLOduration=6.227071126 podStartE2EDuration="12.474072755s" podCreationTimestamp="2025-10-01 16:27:52 +0000 UTC" firstStartedPulling="2025-10-01 16:27:52.984195692 +0000 UTC m=+1535.983842527" lastFinishedPulling="2025-10-01 16:27:59.231197321 +0000 UTC m=+1542.230844156" observedRunningTime="2025-10-01 16:28:04.471963424 +0000 UTC m=+1547.471610259" watchObservedRunningTime="2025-10-01 16:28:04.474072755 +0000 UTC m=+1547.473719590" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.532842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-apiservice-cert\") pod \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.532894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wbn6\" (UniqueName: \"kubernetes.io/projected/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-kube-api-access-5wbn6\") pod \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.533040 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-webhook-cert\") pod \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\" (UID: \"ca2c1940-6297-4207-b2d1-16d5fcb28d5e\") " Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.539470 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "ca2c1940-6297-4207-b2d1-16d5fcb28d5e" (UID: "ca2c1940-6297-4207-b2d1-16d5fcb28d5e"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.545751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "ca2c1940-6297-4207-b2d1-16d5fcb28d5e" (UID: "ca2c1940-6297-4207-b2d1-16d5fcb28d5e"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.545784 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-kube-api-access-5wbn6" (OuterVolumeSpecName: "kube-api-access-5wbn6") pod "ca2c1940-6297-4207-b2d1-16d5fcb28d5e" (UID: "ca2c1940-6297-4207-b2d1-16d5fcb28d5e"). InnerVolumeSpecName "kube-api-access-5wbn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.635869 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.635916 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wbn6\" (UniqueName: \"kubernetes.io/projected/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-kube-api-access-5wbn6\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:04 crc kubenswrapper[4764]: I1001 16:28:04.635931 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca2c1940-6297-4207-b2d1-16d5fcb28d5e-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:05 crc kubenswrapper[4764]: I1001 16:28:05.332429 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" podUID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.205:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:28:05 crc kubenswrapper[4764]: I1001 16:28:05.423396 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74c8b6564b-hsdqw" Oct 01 16:28:05 crc kubenswrapper[4764]: I1001 16:28:05.736680 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" path="/var/lib/kubelet/pods/ca2c1940-6297-4207-b2d1-16d5fcb28d5e/volumes" Oct 01 16:28:06 crc kubenswrapper[4764]: I1001 16:28:06.048767 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6thlp"] Oct 01 16:28:06 crc kubenswrapper[4764]: I1001 16:28:06.071844 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-n85kj"] Oct 01 16:28:06 crc kubenswrapper[4764]: I1001 16:28:06.083545 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-n85kj"] Oct 01 16:28:06 crc kubenswrapper[4764]: I1001 16:28:06.096718 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6thlp"] Oct 01 16:28:07 crc kubenswrapper[4764]: I1001 16:28:07.739643 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06cb65e0-0860-4c2e-87e4-45216c0c3f9f" path="/var/lib/kubelet/pods/06cb65e0-0860-4c2e-87e4-45216c0c3f9f/volumes" Oct 01 16:28:07 crc kubenswrapper[4764]: I1001 16:28:07.740799 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f115db7c-e622-4427-9d2b-f504a5166394" path="/var/lib/kubelet/pods/f115db7c-e622-4427-9d2b-f504a5166394/volumes" Oct 01 16:28:07 crc kubenswrapper[4764]: I1001 16:28:07.843931 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-k2274" Oct 01 16:28:07 crc kubenswrapper[4764]: I1001 16:28:07.914111 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-k2274" Oct 01 16:28:11 crc kubenswrapper[4764]: I1001 16:28:11.028802 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-8220-account-create-qbvl5"] Oct 01 16:28:11 crc kubenswrapper[4764]: I1001 16:28:11.040416 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8220-account-create-qbvl5"] Oct 01 16:28:11 crc kubenswrapper[4764]: I1001 16:28:11.736391 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13af3cfe-d62e-4fa0-8444-36ec829d18fa" path="/var/lib/kubelet/pods/13af3cfe-d62e-4fa0-8444-36ec829d18fa/volumes" Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.714311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-bhdff" Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.779041 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"] Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.779286 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" containerName="frr-k8s-webhook-server" containerID="cri-o://22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c" gracePeriod=10 Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.847296 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-k2274" Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.874654 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-mxkb8" Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.939611 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/controller-5d688f5ffc-f8m22"] Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.940254 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/controller-5d688f5ffc-f8m22" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" 
containerName="controller" containerID="cri-o://a15bb4ffa2041e6f15271ed5b94b680a1ef79cc20d7619d3f5d6ced6a91409e1" gracePeriod=2 Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.940397 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/controller-5d688f5ffc-f8m22" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="kube-rbac-proxy" containerID="cri-o://e1f282d32435aeb8cdeea1239affa34936768bdc60c7d1d8e34946f10b152c13" gracePeriod=2 Oct 01 16:28:12 crc kubenswrapper[4764]: I1001 16:28:12.961796 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/controller-5d688f5ffc-f8m22"] Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.430372 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.518071 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trlhl\" (UniqueName: \"kubernetes.io/projected/6764cb75-6b7f-45aa-a7b7-6347f50022f7-kube-api-access-trlhl\") pod \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.518289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6764cb75-6b7f-45aa-a7b7-6347f50022f7-cert\") pod \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\" (UID: \"6764cb75-6b7f-45aa-a7b7-6347f50022f7\") " Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.524758 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6764cb75-6b7f-45aa-a7b7-6347f50022f7-kube-api-access-trlhl" (OuterVolumeSpecName: "kube-api-access-trlhl") pod "6764cb75-6b7f-45aa-a7b7-6347f50022f7" (UID: "6764cb75-6b7f-45aa-a7b7-6347f50022f7"). InnerVolumeSpecName "kube-api-access-trlhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.525467 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6764cb75-6b7f-45aa-a7b7-6347f50022f7-cert" (OuterVolumeSpecName: "cert") pod "6764cb75-6b7f-45aa-a7b7-6347f50022f7" (UID: "6764cb75-6b7f-45aa-a7b7-6347f50022f7"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.531622 4764 generic.go:334] "Generic (PLEG): container finished" podID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" containerID="22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c" exitCode=0 Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.531682 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.531694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" event={"ID":"6764cb75-6b7f-45aa-a7b7-6347f50022f7","Type":"ContainerDied","Data":"22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c"} Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.531725 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8" event={"ID":"6764cb75-6b7f-45aa-a7b7-6347f50022f7","Type":"ContainerDied","Data":"fcaa3c0d7590cc820ca8bf799421f6c149bcdccf5f2aa209c4d3b17b5289243b"} Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.531746 4764 scope.go:117] "RemoveContainer" containerID="22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.534751 4764 generic.go:334] "Generic (PLEG): container finished" podID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" 
containerID="e1f282d32435aeb8cdeea1239affa34936768bdc60c7d1d8e34946f10b152c13" exitCode=0 Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.534776 4764 generic.go:334] "Generic (PLEG): container finished" podID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerID="a15bb4ffa2041e6f15271ed5b94b680a1ef79cc20d7619d3f5d6ced6a91409e1" exitCode=0 Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.572401 4764 scope.go:117] "RemoveContainer" containerID="22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c" Oct 01 16:28:13 crc kubenswrapper[4764]: E1001 16:28:13.573563 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c\": container with ID starting with 22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c not found: ID does not exist" containerID="22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.573617 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c"} err="failed to get container status \"22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c\": rpc error: code = NotFound desc = could not find container \"22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c\": container with ID starting with 22320a8a7ec765011456e3e067bc479aba9a9bde038d8280fde82ad3c498cc4c not found: ID does not exist" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.579360 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"] Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.586498 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-j2wj8"] Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 
16:28:13.622861 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6764cb75-6b7f-45aa-a7b7-6347f50022f7-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.622906 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trlhl\" (UniqueName: \"kubernetes.io/projected/6764cb75-6b7f-45aa-a7b7-6347f50022f7-kube-api-access-trlhl\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:13 crc kubenswrapper[4764]: I1001 16:28:13.753155 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" path="/var/lib/kubelet/pods/6764cb75-6b7f-45aa-a7b7-6347f50022f7/volumes" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.180133 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.247418 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-cert\") pod \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.247583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sxf9\" (UniqueName: \"kubernetes.io/projected/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-kube-api-access-7sxf9\") pod \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.247630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-metrics-certs\") pod \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\" (UID: \"90ebc18d-5d97-4850-8c9e-2c0d71fcea0f\") " Oct 01 16:28:14 crc 
kubenswrapper[4764]: I1001 16:28:14.252375 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-cert" (OuterVolumeSpecName: "cert") pod "90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" (UID: "90ebc18d-5d97-4850-8c9e-2c0d71fcea0f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.252424 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-kube-api-access-7sxf9" (OuterVolumeSpecName: "kube-api-access-7sxf9") pod "90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" (UID: "90ebc18d-5d97-4850-8c9e-2c0d71fcea0f"). InnerVolumeSpecName "kube-api-access-7sxf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.255950 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" (UID: "90ebc18d-5d97-4850-8c9e-2c0d71fcea0f"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.349728 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.349767 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.349779 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sxf9\" (UniqueName: \"kubernetes.io/projected/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f-kube-api-access-7sxf9\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.552913 4764 scope.go:117] "RemoveContainer" containerID="e1f282d32435aeb8cdeea1239affa34936768bdc60c7d1d8e34946f10b152c13" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.553017 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-f8m22" Oct 01 16:28:14 crc kubenswrapper[4764]: I1001 16:28:14.594145 4764 scope.go:117] "RemoveContainer" containerID="a15bb4ffa2041e6f15271ed5b94b680a1ef79cc20d7619d3f5d6ced6a91409e1" Oct 01 16:28:15 crc kubenswrapper[4764]: I1001 16:28:15.055653 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9a9c-account-create-snkg4"] Oct 01 16:28:15 crc kubenswrapper[4764]: I1001 16:28:15.071246 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9a9c-account-create-snkg4"] Oct 01 16:28:15 crc kubenswrapper[4764]: I1001 16:28:15.749344 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05fa05d2-7e6c-4586-b5c3-712309bbebb5" path="/var/lib/kubelet/pods/05fa05d2-7e6c-4586-b5c3-712309bbebb5/volumes" Oct 01 16:28:15 crc kubenswrapper[4764]: I1001 16:28:15.750522 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" path="/var/lib/kubelet/pods/90ebc18d-5d97-4850-8c9e-2c0d71fcea0f/volumes" Oct 01 16:28:16 crc kubenswrapper[4764]: I1001 16:28:16.038646 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3be7-account-create-gbfns"] Oct 01 16:28:16 crc kubenswrapper[4764]: I1001 16:28:16.051476 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3be7-account-create-gbfns"] Oct 01 16:28:17 crc kubenswrapper[4764]: I1001 16:28:17.738209 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185220ad-4140-4a41-b928-7b68e15408ba" path="/var/lib/kubelet/pods/185220ad-4140-4a41-b928-7b68e15408ba/volumes" Oct 01 16:28:23 crc kubenswrapper[4764]: I1001 16:28:23.477355 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-845477fbc7-z8qk5" Oct 01 16:28:23 crc kubenswrapper[4764]: I1001 16:28:23.575356 4764 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"] Oct 01 16:28:23 crc kubenswrapper[4764]: I1001 16:28:23.575605 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" podUID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" containerName="manager" containerID="cri-o://d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9" gracePeriod=10 Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.081723 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.161506 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-webhook-cert\") pod \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.161606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-apiservice-cert\") pod \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.161647 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qlv6\" (UniqueName: \"kubernetes.io/projected/08dc3cf9-a8da-48f8-bb23-77eb6740d676-kube-api-access-9qlv6\") pod \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\" (UID: \"08dc3cf9-a8da-48f8-bb23-77eb6740d676\") " Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.169982 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08dc3cf9-a8da-48f8-bb23-77eb6740d676-kube-api-access-9qlv6" 
(OuterVolumeSpecName: "kube-api-access-9qlv6") pod "08dc3cf9-a8da-48f8-bb23-77eb6740d676" (UID: "08dc3cf9-a8da-48f8-bb23-77eb6740d676"). InnerVolumeSpecName "kube-api-access-9qlv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.170549 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "08dc3cf9-a8da-48f8-bb23-77eb6740d676" (UID: "08dc3cf9-a8da-48f8-bb23-77eb6740d676"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.173202 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "08dc3cf9-a8da-48f8-bb23-77eb6740d676" (UID: "08dc3cf9-a8da-48f8-bb23-77eb6740d676"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.263862 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.264265 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qlv6\" (UniqueName: \"kubernetes.io/projected/08dc3cf9-a8da-48f8-bb23-77eb6740d676-kube-api-access-9qlv6\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.264355 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/08dc3cf9-a8da-48f8-bb23-77eb6740d676-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.667947 4764 generic.go:334] "Generic (PLEG): container finished" podID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" containerID="d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9" exitCode=0 Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.667989 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" event={"ID":"08dc3cf9-a8da-48f8-bb23-77eb6740d676","Type":"ContainerDied","Data":"d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9"} Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.668030 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" event={"ID":"08dc3cf9-a8da-48f8-bb23-77eb6740d676","Type":"ContainerDied","Data":"37329a5537cabdb2ceff0964c171ff51dd5ae5f04b5c37d1680553df257a113f"} Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.668028 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.668065 4764 scope.go:117] "RemoveContainer" containerID="d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.711652 4764 scope.go:117] "RemoveContainer" containerID="d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9" Oct 01 16:28:24 crc kubenswrapper[4764]: E1001 16:28:24.712507 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9\": container with ID starting with d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9 not found: ID does not exist" containerID="d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.712572 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9"} err="failed to get container status \"d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9\": rpc error: code = NotFound desc = could not find container \"d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9\": container with ID starting with d5a769fab2201d441444512d23b7c4d099250ae4f50b75e66a21520a0abc09f9 not found: ID does not exist" Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.716988 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"] Oct 01 16:28:24 crc kubenswrapper[4764]: I1001 16:28:24.726857 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x"] Oct 01 16:28:25 crc kubenswrapper[4764]: I1001 16:28:25.062531 4764 prober.go:107] "Probe failed" 
probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-56b6c48bf8-msr9x" podUID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.204:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:28:25 crc kubenswrapper[4764]: I1001 16:28:25.734234 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" path="/var/lib/kubelet/pods/08dc3cf9-a8da-48f8-bb23-77eb6740d676/volumes" Oct 01 16:28:34 crc kubenswrapper[4764]: I1001 16:28:34.768597 4764 generic.go:334] "Generic (PLEG): container finished" podID="a4b541d9-8d8f-4dfb-9c59-f00ba871257c" containerID="8a5921afb3a3713d040a30cb566f6eb2d40f370557ec018a3b780d0ffde91301" exitCode=0 Oct 01 16:28:34 crc kubenswrapper[4764]: I1001 16:28:34.768693 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" event={"ID":"a4b541d9-8d8f-4dfb-9c59-f00ba871257c","Type":"ContainerDied","Data":"8a5921afb3a3713d040a30cb566f6eb2d40f370557ec018a3b780d0ffde91301"} Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.137012 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.200344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-inventory\") pod \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.200495 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28wm2\" (UniqueName: \"kubernetes.io/projected/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-kube-api-access-28wm2\") pod \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.200697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-ssh-key\") pod \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\" (UID: \"a4b541d9-8d8f-4dfb-9c59-f00ba871257c\") " Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.206236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-kube-api-access-28wm2" (OuterVolumeSpecName: "kube-api-access-28wm2") pod "a4b541d9-8d8f-4dfb-9c59-f00ba871257c" (UID: "a4b541d9-8d8f-4dfb-9c59-f00ba871257c"). InnerVolumeSpecName "kube-api-access-28wm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.225202 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-inventory" (OuterVolumeSpecName: "inventory") pod "a4b541d9-8d8f-4dfb-9c59-f00ba871257c" (UID: "a4b541d9-8d8f-4dfb-9c59-f00ba871257c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.228919 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4b541d9-8d8f-4dfb-9c59-f00ba871257c" (UID: "a4b541d9-8d8f-4dfb-9c59-f00ba871257c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.303307 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.303678 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.303698 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28wm2\" (UniqueName: \"kubernetes.io/projected/a4b541d9-8d8f-4dfb-9c59-f00ba871257c-kube-api-access-28wm2\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.789724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" event={"ID":"a4b541d9-8d8f-4dfb-9c59-f00ba871257c","Type":"ContainerDied","Data":"fce90c6ca0cdc67d5fec2a63c67723d61c48ac1f2341ffc36389941294175bc1"} Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.789773 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fce90c6ca0cdc67d5fec2a63c67723d61c48ac1f2341ffc36389941294175bc1" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.789788 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901151 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv"] Oct 01 16:28:36 crc kubenswrapper[4764]: E1001 16:28:36.901673 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" containerName="frr-k8s-webhook-server" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901690 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" containerName="frr-k8s-webhook-server" Oct 01 16:28:36 crc kubenswrapper[4764]: E1001 16:28:36.901715 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="kube-rbac-proxy" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901723 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="kube-rbac-proxy" Oct 01 16:28:36 crc kubenswrapper[4764]: E1001 16:28:36.901734 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" containerName="manager" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901743 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" containerName="manager" Oct 01 16:28:36 crc kubenswrapper[4764]: E1001 16:28:36.901760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b541d9-8d8f-4dfb-9c59-f00ba871257c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901771 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b541d9-8d8f-4dfb-9c59-f00ba871257c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:28:36 crc kubenswrapper[4764]: 
E1001 16:28:36.901788 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="controller" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901796 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="controller" Oct 01 16:28:36 crc kubenswrapper[4764]: E1001 16:28:36.901818 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" containerName="webhook-server" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.901828 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" containerName="webhook-server" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902132 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="controller" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902152 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6764cb75-6b7f-45aa-a7b7-6347f50022f7" containerName="frr-k8s-webhook-server" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902168 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="90ebc18d-5d97-4850-8c9e-2c0d71fcea0f" containerName="kube-rbac-proxy" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902192 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="08dc3cf9-a8da-48f8-bb23-77eb6740d676" containerName="manager" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902204 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b541d9-8d8f-4dfb-9c59-f00ba871257c" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902220 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2c1940-6297-4207-b2d1-16d5fcb28d5e" containerName="webhook-server" Oct 01 
16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.902909 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.907534 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.907650 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.907940 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.908182 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.913697 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.913745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p6bw\" (UniqueName: \"kubernetes.io/projected/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-kube-api-access-6p6bw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.913776 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:36 crc kubenswrapper[4764]: I1001 16:28:36.914457 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv"] Oct 01 16:28:36 crc kubenswrapper[4764]: E1001 16:28:36.997866 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4b541d9_8d8f_4dfb_9c59_f00ba871257c.slice\": RecentStats: unable to find data in memory cache]" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.023280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.023348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p6bw\" (UniqueName: \"kubernetes.io/projected/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-kube-api-access-6p6bw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.023403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.030210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.030624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.044303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p6bw\" (UniqueName: \"kubernetes.io/projected/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-kube-api-access-6p6bw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-8stpv\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.226260 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.839777 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv"] Oct 01 16:28:37 crc kubenswrapper[4764]: I1001 16:28:37.844675 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:28:38 crc kubenswrapper[4764]: I1001 16:28:38.824342 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" event={"ID":"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd","Type":"ContainerStarted","Data":"53964d6d55c8e7532c5b1597cd0a7ae1f0b2447543d8237d99a629cc579fe1d4"} Oct 01 16:28:39 crc kubenswrapper[4764]: I1001 16:28:39.835140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" event={"ID":"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd","Type":"ContainerStarted","Data":"ee0a137efebff1555547d5a9673ffb7a7bb9d7af3f4c76c80829a25c9fe24fc0"} Oct 01 16:28:39 crc kubenswrapper[4764]: I1001 16:28:39.851664 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" podStartSLOduration=3.097023947 podStartE2EDuration="3.851648251s" podCreationTimestamp="2025-10-01 16:28:36 +0000 UTC" firstStartedPulling="2025-10-01 16:28:37.844286107 +0000 UTC m=+1580.843932962" lastFinishedPulling="2025-10-01 16:28:38.598910431 +0000 UTC m=+1581.598557266" observedRunningTime="2025-10-01 16:28:39.849238603 +0000 UTC m=+1582.848885438" watchObservedRunningTime="2025-10-01 16:28:39.851648251 +0000 UTC m=+1582.851295086" Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.706655 4764 scope.go:117] "RemoveContainer" containerID="7a54dc2196e6680c4efe8571c163327d370e0d111e3b3fabc746991c92249694" Oct 01 16:28:44 
crc kubenswrapper[4764]: I1001 16:28:44.749436 4764 scope.go:117] "RemoveContainer" containerID="3b066cbb5e42838720cdb004db4995687c2d1c3fbef00a5540c9c32ee17e5f35" Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.832338 4764 scope.go:117] "RemoveContainer" containerID="579c3a582266a04e241f31f465caff6603393ea4e58292efd17966be91509da9" Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.867235 4764 scope.go:117] "RemoveContainer" containerID="7bd6a26a8e132d970d352635b691d736a7f59bbf9f84345e6347c5f5a25870d2" Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.899113 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" containerID="ee0a137efebff1555547d5a9673ffb7a7bb9d7af3f4c76c80829a25c9fe24fc0" exitCode=0 Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.899168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" event={"ID":"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd","Type":"ContainerDied","Data":"ee0a137efebff1555547d5a9673ffb7a7bb9d7af3f4c76c80829a25c9fe24fc0"} Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.905521 4764 scope.go:117] "RemoveContainer" containerID="3f4cdfb94c2237a5095e2964b9fb5d01e8886ac535dc4da381d022b373593f40" Oct 01 16:28:44 crc kubenswrapper[4764]: I1001 16:28:44.948456 4764 scope.go:117] "RemoveContainer" containerID="3277286fa513e00b97faffd5689150739eacd19aceef9423cbc6a24ff57b4888" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.419418 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.525159 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p6bw\" (UniqueName: \"kubernetes.io/projected/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-kube-api-access-6p6bw\") pod \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.525297 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-ssh-key\") pod \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.526772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-inventory\") pod \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\" (UID: \"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd\") " Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.532710 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-kube-api-access-6p6bw" (OuterVolumeSpecName: "kube-api-access-6p6bw") pod "1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" (UID: "1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd"). InnerVolumeSpecName "kube-api-access-6p6bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.566319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-inventory" (OuterVolumeSpecName: "inventory") pod "1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" (UID: "1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.574899 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" (UID: "1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.628978 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p6bw\" (UniqueName: \"kubernetes.io/projected/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-kube-api-access-6p6bw\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.629015 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.629025 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.923475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" event={"ID":"1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd","Type":"ContainerDied","Data":"53964d6d55c8e7532c5b1597cd0a7ae1f0b2447543d8237d99a629cc579fe1d4"} Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.923523 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53964d6d55c8e7532c5b1597cd0a7ae1f0b2447543d8237d99a629cc579fe1d4" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.923587 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.993718 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj"] Oct 01 16:28:46 crc kubenswrapper[4764]: E1001 16:28:46.994344 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.994413 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.994658 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:28:46 crc kubenswrapper[4764]: I1001 16:28:46.995363 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.001533 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.001577 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.003114 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.004306 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.007314 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj"] Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.141075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4kgt\" (UniqueName: \"kubernetes.io/projected/86c29699-15b0-4a46-b9f6-afb8330b459d-kube-api-access-x4kgt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.141129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.141153 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.243270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4kgt\" (UniqueName: \"kubernetes.io/projected/86c29699-15b0-4a46-b9f6-afb8330b459d-kube-api-access-x4kgt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.243314 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.243352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.248679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: 
\"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.248690 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.264498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4kgt\" (UniqueName: \"kubernetes.io/projected/86c29699-15b0-4a46-b9f6-afb8330b459d-kube-api-access-x4kgt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dd5gj\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.316762 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.893169 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj"] Oct 01 16:28:47 crc kubenswrapper[4764]: I1001 16:28:47.932196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" event={"ID":"86c29699-15b0-4a46-b9f6-afb8330b459d","Type":"ContainerStarted","Data":"f87d5d2eb5fad7b2b2b291a5d9569bf047f04adeae7ee9fa7dab052454c295bd"} Oct 01 16:28:49 crc kubenswrapper[4764]: I1001 16:28:49.964137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" event={"ID":"86c29699-15b0-4a46-b9f6-afb8330b459d","Type":"ContainerStarted","Data":"05522678f7b5b2f2596d8d9a274d086645f451b8d9a21ba60655e5f0ddcb051d"} Oct 01 16:28:49 crc kubenswrapper[4764]: I1001 16:28:49.993169 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" podStartSLOduration=2.509450425 podStartE2EDuration="3.993147955s" podCreationTimestamp="2025-10-01 16:28:46 +0000 UTC" firstStartedPulling="2025-10-01 16:28:47.90741937 +0000 UTC m=+1590.907066215" lastFinishedPulling="2025-10-01 16:28:49.39111688 +0000 UTC m=+1592.390763745" observedRunningTime="2025-10-01 16:28:49.988383609 +0000 UTC m=+1592.988030454" watchObservedRunningTime="2025-10-01 16:28:49.993147955 +0000 UTC m=+1592.992794800" Oct 01 16:28:50 crc kubenswrapper[4764]: I1001 16:28:50.062167 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6jx59"] Oct 01 16:28:50 crc kubenswrapper[4764]: I1001 16:28:50.074128 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vt657"] Oct 01 16:28:50 crc kubenswrapper[4764]: I1001 16:28:50.087415 4764 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4p5lt"] Oct 01 16:28:50 crc kubenswrapper[4764]: I1001 16:28:50.102344 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vt657"] Oct 01 16:28:50 crc kubenswrapper[4764]: I1001 16:28:50.119764 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6jx59"] Oct 01 16:28:50 crc kubenswrapper[4764]: I1001 16:28:50.132941 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4p5lt"] Oct 01 16:28:51 crc kubenswrapper[4764]: I1001 16:28:51.736350 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b95bc66-5cbe-4ef5-a2db-64f94391bf65" path="/var/lib/kubelet/pods/0b95bc66-5cbe-4ef5-a2db-64f94391bf65/volumes" Oct 01 16:28:51 crc kubenswrapper[4764]: I1001 16:28:51.737535 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22612f67-a7c0-4c0c-9b45-3a2ba2ea7681" path="/var/lib/kubelet/pods/22612f67-a7c0-4c0c-9b45-3a2ba2ea7681/volumes" Oct 01 16:28:51 crc kubenswrapper[4764]: I1001 16:28:51.738275 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819d701c-51c3-4e8d-a2f4-a9e39b81d65b" path="/var/lib/kubelet/pods/819d701c-51c3-4e8d-a2f4-a9e39b81d65b/volumes" Oct 01 16:28:59 crc kubenswrapper[4764]: I1001 16:28:59.044160 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-242lx"] Oct 01 16:28:59 crc kubenswrapper[4764]: I1001 16:28:59.052934 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-242lx"] Oct 01 16:28:59 crc kubenswrapper[4764]: I1001 16:28:59.732719 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24da24d1-d732-469a-839b-bc1aea3737d8" path="/var/lib/kubelet/pods/24da24d1-d732-469a-839b-bc1aea3737d8/volumes" Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.035271 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-aee5-account-create-sb84m"] Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.049908 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1d44-account-create-54jl9"] Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.062700 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a401-account-create-pqkft"] Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.071528 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1d44-account-create-54jl9"] Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.080194 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a401-account-create-pqkft"] Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.088691 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-aee5-account-create-sb84m"] Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.732290 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b49f677-c2a8-4637-a63f-93f382e73b92" path="/var/lib/kubelet/pods/4b49f677-c2a8-4637-a63f-93f382e73b92/volumes" Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.733141 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e62d88c-83b8-4ab4-8ad9-f231e99b83fe" path="/var/lib/kubelet/pods/9e62d88c-83b8-4ab4-8ad9-f231e99b83fe/volumes" Oct 01 16:29:05 crc kubenswrapper[4764]: I1001 16:29:05.733698 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac4ce93-5f70-4bc7-8eb6-533aa845be98" path="/var/lib/kubelet/pods/dac4ce93-5f70-4bc7-8eb6-533aa845be98/volumes" Oct 01 16:29:06 crc kubenswrapper[4764]: I1001 16:29:06.047265 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7qnvc"] Oct 01 16:29:06 crc kubenswrapper[4764]: I1001 16:29:06.054299 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7qnvc"] Oct 01 16:29:07 crc 
kubenswrapper[4764]: I1001 16:29:07.736977 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071b3286-64c8-4945-952e-3ba22f94e118" path="/var/lib/kubelet/pods/071b3286-64c8-4945-952e-3ba22f94e118/volumes" Oct 01 16:29:21 crc kubenswrapper[4764]: I1001 16:29:21.913708 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:29:21 crc kubenswrapper[4764]: I1001 16:29:21.914368 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.251918 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7lqfm"] Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.257190 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.264012 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lqfm"] Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.328479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-catalog-content\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.328766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-utilities\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.329104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r99kw\" (UniqueName: \"kubernetes.io/projected/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-kube-api-access-r99kw\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.400703 4764 generic.go:334] "Generic (PLEG): container finished" podID="86c29699-15b0-4a46-b9f6-afb8330b459d" containerID="05522678f7b5b2f2596d8d9a274d086645f451b8d9a21ba60655e5f0ddcb051d" exitCode=0 Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.400820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" 
event={"ID":"86c29699-15b0-4a46-b9f6-afb8330b459d","Type":"ContainerDied","Data":"05522678f7b5b2f2596d8d9a274d086645f451b8d9a21ba60655e5f0ddcb051d"} Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.431020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-catalog-content\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.431115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-utilities\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.431177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r99kw\" (UniqueName: \"kubernetes.io/projected/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-kube-api-access-r99kw\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.431743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-catalog-content\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.431763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-utilities\") pod 
\"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.455676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r99kw\" (UniqueName: \"kubernetes.io/projected/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-kube-api-access-r99kw\") pod \"certified-operators-7lqfm\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:30 crc kubenswrapper[4764]: I1001 16:29:30.592717 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.061163 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lqfm"] Oct 01 16:29:31 crc kubenswrapper[4764]: W1001 16:29:31.068210 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cce1994_45da_4b5e_9f0e_ca1fd68660b4.slice/crio-920de7ce302e699da97e1b5ed22b309d1f55fd91399463a8ee808f47247b97b5 WatchSource:0}: Error finding container 920de7ce302e699da97e1b5ed22b309d1f55fd91399463a8ee808f47247b97b5: Status 404 returned error can't find the container with id 920de7ce302e699da97e1b5ed22b309d1f55fd91399463a8ee808f47247b97b5 Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.413098 4764 generic.go:334] "Generic (PLEG): container finished" podID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerID="42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de" exitCode=0 Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.413150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" 
event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerDied","Data":"42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de"} Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.413486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerStarted","Data":"920de7ce302e699da97e1b5ed22b309d1f55fd91399463a8ee808f47247b97b5"} Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.875953 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.961789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-ssh-key\") pod \"86c29699-15b0-4a46-b9f6-afb8330b459d\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.962093 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-inventory\") pod \"86c29699-15b0-4a46-b9f6-afb8330b459d\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.962128 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4kgt\" (UniqueName: \"kubernetes.io/projected/86c29699-15b0-4a46-b9f6-afb8330b459d-kube-api-access-x4kgt\") pod \"86c29699-15b0-4a46-b9f6-afb8330b459d\" (UID: \"86c29699-15b0-4a46-b9f6-afb8330b459d\") " Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.968313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c29699-15b0-4a46-b9f6-afb8330b459d-kube-api-access-x4kgt" (OuterVolumeSpecName: 
"kube-api-access-x4kgt") pod "86c29699-15b0-4a46-b9f6-afb8330b459d" (UID: "86c29699-15b0-4a46-b9f6-afb8330b459d"). InnerVolumeSpecName "kube-api-access-x4kgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.987130 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "86c29699-15b0-4a46-b9f6-afb8330b459d" (UID: "86c29699-15b0-4a46-b9f6-afb8330b459d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:29:31 crc kubenswrapper[4764]: I1001 16:29:31.993328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-inventory" (OuterVolumeSpecName: "inventory") pod "86c29699-15b0-4a46-b9f6-afb8330b459d" (UID: "86c29699-15b0-4a46-b9f6-afb8330b459d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.065174 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.065233 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4kgt\" (UniqueName: \"kubernetes.io/projected/86c29699-15b0-4a46-b9f6-afb8330b459d-kube-api-access-x4kgt\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.065253 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/86c29699-15b0-4a46-b9f6-afb8330b459d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.424726 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" event={"ID":"86c29699-15b0-4a46-b9f6-afb8330b459d","Type":"ContainerDied","Data":"f87d5d2eb5fad7b2b2b291a5d9569bf047f04adeae7ee9fa7dab052454c295bd"} Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.424850 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f87d5d2eb5fad7b2b2b291a5d9569bf047f04adeae7ee9fa7dab052454c295bd" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.424764 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.427617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerStarted","Data":"a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129"} Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.512310 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc"] Oct 01 16:29:32 crc kubenswrapper[4764]: E1001 16:29:32.512954 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c29699-15b0-4a46-b9f6-afb8330b459d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.512994 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c29699-15b0-4a46-b9f6-afb8330b459d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.513218 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c29699-15b0-4a46-b9f6-afb8330b459d" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.514117 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.517437 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.517460 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.517972 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.524370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.527729 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc"] Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.574119 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.574194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.574384 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmprw\" (UniqueName: \"kubernetes.io/projected/e46429d1-3fd7-40af-8e14-05b775bc1197-kube-api-access-hmprw\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.675758 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmprw\" (UniqueName: \"kubernetes.io/projected/e46429d1-3fd7-40af-8e14-05b775bc1197-kube-api-access-hmprw\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.676236 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.676290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.680186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: 
\"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.681200 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.694267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmprw\" (UniqueName: \"kubernetes.io/projected/e46429d1-3fd7-40af-8e14-05b775bc1197-kube-api-access-hmprw\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:32 crc kubenswrapper[4764]: I1001 16:29:32.844227 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.532342 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc"] Oct 01 16:29:33 crc kubenswrapper[4764]: W1001 16:29:33.539807 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode46429d1_3fd7_40af_8e14_05b775bc1197.slice/crio-092381a3ebcc4f9e820ed946caa587d65c31409b8cde423b3abeba9b9c929c2d WatchSource:0}: Error finding container 092381a3ebcc4f9e820ed946caa587d65c31409b8cde423b3abeba9b9c929c2d: Status 404 returned error can't find the container with id 092381a3ebcc4f9e820ed946caa587d65c31409b8cde423b3abeba9b9c929c2d Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.845506 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-868dq"] Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.850744 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.870344 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-868dq"] Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.902683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-catalog-content\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.902765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkx42\" (UniqueName: \"kubernetes.io/projected/74d7d807-2803-4e0f-b18d-40de6e75d74c-kube-api-access-mkx42\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:33 crc kubenswrapper[4764]: I1001 16:29:33.902819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-utilities\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.004873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-catalog-content\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.004963 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mkx42\" (UniqueName: \"kubernetes.io/projected/74d7d807-2803-4e0f-b18d-40de6e75d74c-kube-api-access-mkx42\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.005014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-utilities\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.005749 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-utilities\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.005914 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-catalog-content\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.033453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkx42\" (UniqueName: \"kubernetes.io/projected/74d7d807-2803-4e0f-b18d-40de6e75d74c-kube-api-access-mkx42\") pod \"redhat-marketplace-868dq\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.188181 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.457011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" event={"ID":"e46429d1-3fd7-40af-8e14-05b775bc1197","Type":"ContainerStarted","Data":"c159541cbac17aabb8fbad292de301d773f006ebd5b80c34af1c738521b69770"} Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.457395 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" event={"ID":"e46429d1-3fd7-40af-8e14-05b775bc1197","Type":"ContainerStarted","Data":"092381a3ebcc4f9e820ed946caa587d65c31409b8cde423b3abeba9b9c929c2d"} Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.460236 4764 generic.go:334] "Generic (PLEG): container finished" podID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerID="a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129" exitCode=0 Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.460287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerDied","Data":"a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129"} Oct 01 16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.484190 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" podStartSLOduration=2.005348875 podStartE2EDuration="2.484172213s" podCreationTimestamp="2025-10-01 16:29:32 +0000 UTC" firstStartedPulling="2025-10-01 16:29:33.544078922 +0000 UTC m=+1636.543725757" lastFinishedPulling="2025-10-01 16:29:34.02290225 +0000 UTC m=+1637.022549095" observedRunningTime="2025-10-01 16:29:34.475496442 +0000 UTC m=+1637.475143287" watchObservedRunningTime="2025-10-01 16:29:34.484172213 +0000 UTC m=+1637.483819048" Oct 01 
16:29:34 crc kubenswrapper[4764]: I1001 16:29:34.672603 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-868dq"] Oct 01 16:29:34 crc kubenswrapper[4764]: W1001 16:29:34.676285 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d7d807_2803_4e0f_b18d_40de6e75d74c.slice/crio-689e31ae9d710f1620bdc4cf2111dd6878a0c3592c5024b318321d535185f3fc WatchSource:0}: Error finding container 689e31ae9d710f1620bdc4cf2111dd6878a0c3592c5024b318321d535185f3fc: Status 404 returned error can't find the container with id 689e31ae9d710f1620bdc4cf2111dd6878a0c3592c5024b318321d535185f3fc Oct 01 16:29:35 crc kubenswrapper[4764]: I1001 16:29:35.474092 4764 generic.go:334] "Generic (PLEG): container finished" podID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerID="86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22" exitCode=0 Oct 01 16:29:35 crc kubenswrapper[4764]: I1001 16:29:35.474541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerDied","Data":"86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22"} Oct 01 16:29:35 crc kubenswrapper[4764]: I1001 16:29:35.474572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerStarted","Data":"689e31ae9d710f1620bdc4cf2111dd6878a0c3592c5024b318321d535185f3fc"} Oct 01 16:29:35 crc kubenswrapper[4764]: I1001 16:29:35.477998 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerStarted","Data":"423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c"} Oct 01 16:29:35 crc kubenswrapper[4764]: I1001 16:29:35.518431 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7lqfm" podStartSLOduration=2.063229582 podStartE2EDuration="5.518403866s" podCreationTimestamp="2025-10-01 16:29:30 +0000 UTC" firstStartedPulling="2025-10-01 16:29:31.416906953 +0000 UTC m=+1634.416553788" lastFinishedPulling="2025-10-01 16:29:34.872081197 +0000 UTC m=+1637.871728072" observedRunningTime="2025-10-01 16:29:35.509761736 +0000 UTC m=+1638.509408571" watchObservedRunningTime="2025-10-01 16:29:35.518403866 +0000 UTC m=+1638.518050741" Oct 01 16:29:36 crc kubenswrapper[4764]: I1001 16:29:36.491870 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerStarted","Data":"ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6"} Oct 01 16:29:37 crc kubenswrapper[4764]: I1001 16:29:37.048487 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7hhkr"] Oct 01 16:29:37 crc kubenswrapper[4764]: I1001 16:29:37.057757 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7hhkr"] Oct 01 16:29:37 crc kubenswrapper[4764]: I1001 16:29:37.510387 4764 generic.go:334] "Generic (PLEG): container finished" podID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerID="ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6" exitCode=0 Oct 01 16:29:37 crc kubenswrapper[4764]: I1001 16:29:37.510439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerDied","Data":"ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6"} Oct 01 16:29:37 crc kubenswrapper[4764]: I1001 16:29:37.733335 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c087a812-124b-496c-afe6-b8ba3ca79ada" 
path="/var/lib/kubelet/pods/c087a812-124b-496c-afe6-b8ba3ca79ada/volumes" Oct 01 16:29:38 crc kubenswrapper[4764]: I1001 16:29:38.033322 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d9nkl"] Oct 01 16:29:38 crc kubenswrapper[4764]: I1001 16:29:38.041151 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d9nkl"] Oct 01 16:29:38 crc kubenswrapper[4764]: I1001 16:29:38.522783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerStarted","Data":"f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6"} Oct 01 16:29:38 crc kubenswrapper[4764]: I1001 16:29:38.546341 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-868dq" podStartSLOduration=3.115940981 podStartE2EDuration="5.546322212s" podCreationTimestamp="2025-10-01 16:29:33 +0000 UTC" firstStartedPulling="2025-10-01 16:29:35.477081904 +0000 UTC m=+1638.476728749" lastFinishedPulling="2025-10-01 16:29:37.907463155 +0000 UTC m=+1640.907109980" observedRunningTime="2025-10-01 16:29:38.543854412 +0000 UTC m=+1641.543501257" watchObservedRunningTime="2025-10-01 16:29:38.546322212 +0000 UTC m=+1641.545969047" Oct 01 16:29:39 crc kubenswrapper[4764]: I1001 16:29:39.030440 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-n8qfb"] Oct 01 16:29:39 crc kubenswrapper[4764]: I1001 16:29:39.038558 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-n8qfb"] Oct 01 16:29:39 crc kubenswrapper[4764]: I1001 16:29:39.547321 4764 generic.go:334] "Generic (PLEG): container finished" podID="e46429d1-3fd7-40af-8e14-05b775bc1197" containerID="c159541cbac17aabb8fbad292de301d773f006ebd5b80c34af1c738521b69770" exitCode=0 Oct 01 16:29:39 crc kubenswrapper[4764]: I1001 16:29:39.547406 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" event={"ID":"e46429d1-3fd7-40af-8e14-05b775bc1197","Type":"ContainerDied","Data":"c159541cbac17aabb8fbad292de301d773f006ebd5b80c34af1c738521b69770"} Oct 01 16:29:39 crc kubenswrapper[4764]: I1001 16:29:39.743565 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c44de1b-4886-4c2b-a57c-a234a882e4a6" path="/var/lib/kubelet/pods/0c44de1b-4886-4c2b-a57c-a234a882e4a6/volumes" Oct 01 16:29:39 crc kubenswrapper[4764]: I1001 16:29:39.745141 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f26797c-24fd-4c0c-bf6e-5cb3e53c898d" path="/var/lib/kubelet/pods/7f26797c-24fd-4c0c-bf6e-5cb3e53c898d/volumes" Oct 01 16:29:40 crc kubenswrapper[4764]: I1001 16:29:40.593627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:40 crc kubenswrapper[4764]: I1001 16:29:40.593973 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:40 crc kubenswrapper[4764]: I1001 16:29:40.692867 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.000799 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.047225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-ssh-key\") pod \"e46429d1-3fd7-40af-8e14-05b775bc1197\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.047276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmprw\" (UniqueName: \"kubernetes.io/projected/e46429d1-3fd7-40af-8e14-05b775bc1197-kube-api-access-hmprw\") pod \"e46429d1-3fd7-40af-8e14-05b775bc1197\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.047316 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-inventory\") pod \"e46429d1-3fd7-40af-8e14-05b775bc1197\" (UID: \"e46429d1-3fd7-40af-8e14-05b775bc1197\") " Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.055562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46429d1-3fd7-40af-8e14-05b775bc1197-kube-api-access-hmprw" (OuterVolumeSpecName: "kube-api-access-hmprw") pod "e46429d1-3fd7-40af-8e14-05b775bc1197" (UID: "e46429d1-3fd7-40af-8e14-05b775bc1197"). InnerVolumeSpecName "kube-api-access-hmprw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.078100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e46429d1-3fd7-40af-8e14-05b775bc1197" (UID: "e46429d1-3fd7-40af-8e14-05b775bc1197"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.083441 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-inventory" (OuterVolumeSpecName: "inventory") pod "e46429d1-3fd7-40af-8e14-05b775bc1197" (UID: "e46429d1-3fd7-40af-8e14-05b775bc1197"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.150029 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.150084 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmprw\" (UniqueName: \"kubernetes.io/projected/e46429d1-3fd7-40af-8e14-05b775bc1197-kube-api-access-hmprw\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.150122 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e46429d1-3fd7-40af-8e14-05b775bc1197-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.577221 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" event={"ID":"e46429d1-3fd7-40af-8e14-05b775bc1197","Type":"ContainerDied","Data":"092381a3ebcc4f9e820ed946caa587d65c31409b8cde423b3abeba9b9c929c2d"} Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.577277 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.577286 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="092381a3ebcc4f9e820ed946caa587d65c31409b8cde423b3abeba9b9c929c2d" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.646020 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8"] Oct 01 16:29:41 crc kubenswrapper[4764]: E1001 16:29:41.646585 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46429d1-3fd7-40af-8e14-05b775bc1197" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.646602 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46429d1-3fd7-40af-8e14-05b775bc1197" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.646869 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46429d1-3fd7-40af-8e14-05b775bc1197" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.647660 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.648289 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8"] Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.699264 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.699468 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.699580 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.699684 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.732440 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.801806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9qz7\" (UniqueName: \"kubernetes.io/projected/95442ee6-0604-4c09-b597-a2c7d1f19985-kube-api-access-h9qz7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.801851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: 
\"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.801912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.904548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9qz7\" (UniqueName: \"kubernetes.io/projected/95442ee6-0604-4c09-b597-a2c7d1f19985-kube-api-access-h9qz7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.904618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.904719 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.923278 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.929709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9qz7\" (UniqueName: \"kubernetes.io/projected/95442ee6-0604-4c09-b597-a2c7d1f19985-kube-api-access-h9qz7\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:41 crc kubenswrapper[4764]: I1001 16:29:41.934679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:42 crc kubenswrapper[4764]: I1001 16:29:42.013585 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:29:42 crc kubenswrapper[4764]: I1001 16:29:42.233230 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lqfm"] Oct 01 16:29:42 crc kubenswrapper[4764]: I1001 16:29:42.614631 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8"] Oct 01 16:29:42 crc kubenswrapper[4764]: W1001 16:29:42.629191 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95442ee6_0604_4c09_b597_a2c7d1f19985.slice/crio-2ed553f168d38629e1034258d0daa966eaab1f10ee18d4ee82db1d74b636a1ed WatchSource:0}: Error finding container 2ed553f168d38629e1034258d0daa966eaab1f10ee18d4ee82db1d74b636a1ed: Status 404 returned error can't find the container with id 2ed553f168d38629e1034258d0daa966eaab1f10ee18d4ee82db1d74b636a1ed Oct 01 16:29:43 crc kubenswrapper[4764]: I1001 16:29:43.611649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" event={"ID":"95442ee6-0604-4c09-b597-a2c7d1f19985","Type":"ContainerStarted","Data":"1cc8b29e1f2180ae404f33f65f3533100368a78feb0edd58fd47d8b4a4f029fb"} Oct 01 16:29:43 crc kubenswrapper[4764]: I1001 16:29:43.612076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" event={"ID":"95442ee6-0604-4c09-b597-a2c7d1f19985","Type":"ContainerStarted","Data":"2ed553f168d38629e1034258d0daa966eaab1f10ee18d4ee82db1d74b636a1ed"} Oct 01 16:29:43 crc kubenswrapper[4764]: I1001 16:29:43.615191 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7lqfm" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="registry-server" 
containerID="cri-o://423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c" gracePeriod=2 Oct 01 16:29:43 crc kubenswrapper[4764]: I1001 16:29:43.640622 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" podStartSLOduration=2.11697123 podStartE2EDuration="2.640600344s" podCreationTimestamp="2025-10-01 16:29:41 +0000 UTC" firstStartedPulling="2025-10-01 16:29:42.633838807 +0000 UTC m=+1645.633485672" lastFinishedPulling="2025-10-01 16:29:43.157467941 +0000 UTC m=+1646.157114786" observedRunningTime="2025-10-01 16:29:43.633116783 +0000 UTC m=+1646.632763648" watchObservedRunningTime="2025-10-01 16:29:43.640600344 +0000 UTC m=+1646.640247179" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.035221 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.152543 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-catalog-content\") pod \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.152827 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-utilities\") pod \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\" (UID: \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.152868 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r99kw\" (UniqueName: \"kubernetes.io/projected/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-kube-api-access-r99kw\") pod \"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\" (UID: 
\"9cce1994-45da-4b5e-9f0e-ca1fd68660b4\") " Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.153767 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-utilities" (OuterVolumeSpecName: "utilities") pod "9cce1994-45da-4b5e-9f0e-ca1fd68660b4" (UID: "9cce1994-45da-4b5e-9f0e-ca1fd68660b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.162257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-kube-api-access-r99kw" (OuterVolumeSpecName: "kube-api-access-r99kw") pod "9cce1994-45da-4b5e-9f0e-ca1fd68660b4" (UID: "9cce1994-45da-4b5e-9f0e-ca1fd68660b4"). InnerVolumeSpecName "kube-api-access-r99kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.188911 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.189272 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.209163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cce1994-45da-4b5e-9f0e-ca1fd68660b4" (UID: "9cce1994-45da-4b5e-9f0e-ca1fd68660b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.255111 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.255144 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.255154 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r99kw\" (UniqueName: \"kubernetes.io/projected/9cce1994-45da-4b5e-9f0e-ca1fd68660b4-kube-api-access-r99kw\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.263653 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.623770 4764 generic.go:334] "Generic (PLEG): container finished" podID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerID="423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c" exitCode=0 Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.624010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerDied","Data":"423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c"} Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.625312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqfm" event={"ID":"9cce1994-45da-4b5e-9f0e-ca1fd68660b4","Type":"ContainerDied","Data":"920de7ce302e699da97e1b5ed22b309d1f55fd91399463a8ee808f47247b97b5"} Oct 01 16:29:44 crc kubenswrapper[4764]: 
I1001 16:29:44.625344 4764 scope.go:117] "RemoveContainer" containerID="423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.625660 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lqfm" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.660071 4764 scope.go:117] "RemoveContainer" containerID="a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.675131 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lqfm"] Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.682185 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.683277 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7lqfm"] Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.686564 4764 scope.go:117] "RemoveContainer" containerID="42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.759599 4764 scope.go:117] "RemoveContainer" containerID="423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c" Oct 01 16:29:44 crc kubenswrapper[4764]: E1001 16:29:44.760399 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c\": container with ID starting with 423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c not found: ID does not exist" containerID="423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.760442 4764 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c"} err="failed to get container status \"423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c\": rpc error: code = NotFound desc = could not find container \"423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c\": container with ID starting with 423bb36af344e6b5f52bb7f16f3abe89b61a9576749d78650597f8a19b5bf16c not found: ID does not exist" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.760471 4764 scope.go:117] "RemoveContainer" containerID="a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129" Oct 01 16:29:44 crc kubenswrapper[4764]: E1001 16:29:44.763374 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129\": container with ID starting with a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129 not found: ID does not exist" containerID="a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.763436 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129"} err="failed to get container status \"a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129\": rpc error: code = NotFound desc = could not find container \"a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129\": container with ID starting with a8b530d71441d3cf918d404eab3078ee2cbfeb665c8d3903a90e613f26a0b129 not found: ID does not exist" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.763475 4764 scope.go:117] "RemoveContainer" containerID="42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de" Oct 01 16:29:44 crc kubenswrapper[4764]: E1001 16:29:44.764113 4764 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de\": container with ID starting with 42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de not found: ID does not exist" containerID="42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de" Oct 01 16:29:44 crc kubenswrapper[4764]: I1001 16:29:44.764147 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de"} err="failed to get container status \"42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de\": rpc error: code = NotFound desc = could not find container \"42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de\": container with ID starting with 42c41db455e88db0d79234b7548c48ec4fc6454fa9b1aec19cd5a87654a5a5de not found: ID does not exist" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.211852 4764 scope.go:117] "RemoveContainer" containerID="4f926bf754746465e73cf6dcb7941c46d315fd5b4b4fc76e1dfce2b33141c411" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.250800 4764 scope.go:117] "RemoveContainer" containerID="0b85c0940795c20d982fadb7c007e7cbc4dcf1de6aa55faf30bc0cf05b2e799e" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.314802 4764 scope.go:117] "RemoveContainer" containerID="817a62351be6bef79b4ca476bdf4db00f071e236e6ca54a7b052a3a8e5f809d8" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.361656 4764 scope.go:117] "RemoveContainer" containerID="a0de567bbcc81cba6f96918fc7f6231c862b2e7b0b3e0b4940bc97e6ab986e63" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.411599 4764 scope.go:117] "RemoveContainer" containerID="2630a8464bfcbfc8eb1556493963ba1a7431254312e294b4cd55e42ef4474e09" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.444764 4764 scope.go:117] "RemoveContainer" 
containerID="6558f414897181533fe9abd810770c686269aa91cc01be5c407a467c1c81f04b" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.510160 4764 scope.go:117] "RemoveContainer" containerID="95ec0f90aea22f05b6529dfe7c35539225a3ac47b4d5fec9e8db24fbe416958d" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.533811 4764 scope.go:117] "RemoveContainer" containerID="cecb434757cfeb57580889448b4d14a0327e24caa925d5bf70f67a33d376a0a9" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.578597 4764 scope.go:117] "RemoveContainer" containerID="b740fd1f3a00f767c5ff228b40b5c7b4c7955318075b851da3594e28395ce1b2" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.599565 4764 scope.go:117] "RemoveContainer" containerID="381f240de68976a99439117182e6687d000f39f56f744fb172deecd5f3bcdd24" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.659558 4764 scope.go:117] "RemoveContainer" containerID="8aae9048d9df1908aec76735625fd07337d605eb944d97a919c0fc9ce183c22c" Oct 01 16:29:45 crc kubenswrapper[4764]: I1001 16:29:45.735380 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" path="/var/lib/kubelet/pods/9cce1994-45da-4b5e-9f0e-ca1fd68660b4/volumes" Oct 01 16:29:46 crc kubenswrapper[4764]: I1001 16:29:46.439240 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-868dq"] Oct 01 16:29:47 crc kubenswrapper[4764]: I1001 16:29:47.059097 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vhgdf"] Oct 01 16:29:47 crc kubenswrapper[4764]: I1001 16:29:47.069590 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vhgdf"] Oct 01 16:29:47 crc kubenswrapper[4764]: I1001 16:29:47.687322 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-868dq" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="registry-server" 
containerID="cri-o://f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6" gracePeriod=2 Oct 01 16:29:47 crc kubenswrapper[4764]: I1001 16:29:47.739099 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b75f147-9726-4336-8467-932ad4ff15f1" path="/var/lib/kubelet/pods/7b75f147-9726-4336-8467-932ad4ff15f1/volumes" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.213167 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.337339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-utilities\") pod \"74d7d807-2803-4e0f-b18d-40de6e75d74c\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.337424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkx42\" (UniqueName: \"kubernetes.io/projected/74d7d807-2803-4e0f-b18d-40de6e75d74c-kube-api-access-mkx42\") pod \"74d7d807-2803-4e0f-b18d-40de6e75d74c\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.337479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-catalog-content\") pod \"74d7d807-2803-4e0f-b18d-40de6e75d74c\" (UID: \"74d7d807-2803-4e0f-b18d-40de6e75d74c\") " Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.338613 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-utilities" (OuterVolumeSpecName: "utilities") pod "74d7d807-2803-4e0f-b18d-40de6e75d74c" (UID: "74d7d807-2803-4e0f-b18d-40de6e75d74c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.343799 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d7d807-2803-4e0f-b18d-40de6e75d74c-kube-api-access-mkx42" (OuterVolumeSpecName: "kube-api-access-mkx42") pod "74d7d807-2803-4e0f-b18d-40de6e75d74c" (UID: "74d7d807-2803-4e0f-b18d-40de6e75d74c"). InnerVolumeSpecName "kube-api-access-mkx42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.353664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74d7d807-2803-4e0f-b18d-40de6e75d74c" (UID: "74d7d807-2803-4e0f-b18d-40de6e75d74c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.439636 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.439663 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkx42\" (UniqueName: \"kubernetes.io/projected/74d7d807-2803-4e0f-b18d-40de6e75d74c-kube-api-access-mkx42\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.439674 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d7d807-2803-4e0f-b18d-40de6e75d74c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.695309 4764 generic.go:334] "Generic (PLEG): container finished" podID="74d7d807-2803-4e0f-b18d-40de6e75d74c" 
containerID="f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6" exitCode=0 Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.695355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerDied","Data":"f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6"} Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.695380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-868dq" event={"ID":"74d7d807-2803-4e0f-b18d-40de6e75d74c","Type":"ContainerDied","Data":"689e31ae9d710f1620bdc4cf2111dd6878a0c3592c5024b318321d535185f3fc"} Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.695398 4764 scope.go:117] "RemoveContainer" containerID="f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.695519 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-868dq" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.777424 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-868dq"] Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.786901 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-868dq"] Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.791483 4764 scope.go:117] "RemoveContainer" containerID="ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.822536 4764 scope.go:117] "RemoveContainer" containerID="86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.871894 4764 scope.go:117] "RemoveContainer" containerID="f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6" Oct 01 16:29:48 crc kubenswrapper[4764]: E1001 16:29:48.872523 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6\": container with ID starting with f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6 not found: ID does not exist" containerID="f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.872574 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6"} err="failed to get container status \"f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6\": rpc error: code = NotFound desc = could not find container \"f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6\": container with ID starting with f6eaf44d08b64eaf27ee1633b871a991615dbba6a88de16ce7bc0495015045f6 not found: 
ID does not exist" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.872605 4764 scope.go:117] "RemoveContainer" containerID="ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6" Oct 01 16:29:48 crc kubenswrapper[4764]: E1001 16:29:48.873089 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6\": container with ID starting with ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6 not found: ID does not exist" containerID="ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.873135 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6"} err="failed to get container status \"ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6\": rpc error: code = NotFound desc = could not find container \"ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6\": container with ID starting with ff423e26b6405c2cfc123a43ed398d37f248e4257b687ec6e5bdab645a38ddf6 not found: ID does not exist" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.873162 4764 scope.go:117] "RemoveContainer" containerID="86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22" Oct 01 16:29:48 crc kubenswrapper[4764]: E1001 16:29:48.874408 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22\": container with ID starting with 86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22 not found: ID does not exist" containerID="86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22" Oct 01 16:29:48 crc kubenswrapper[4764]: I1001 16:29:48.874480 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22"} err="failed to get container status \"86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22\": rpc error: code = NotFound desc = could not find container \"86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22\": container with ID starting with 86a5e17e6537f1af40a4eac90621a5f59f05269378c00062306889b4aecf1d22 not found: ID does not exist" Oct 01 16:29:49 crc kubenswrapper[4764]: I1001 16:29:49.737431 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" path="/var/lib/kubelet/pods/74d7d807-2803-4e0f-b18d-40de6e75d74c/volumes" Oct 01 16:29:51 crc kubenswrapper[4764]: I1001 16:29:51.915037 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:29:51 crc kubenswrapper[4764]: I1001 16:29:51.915424 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:29:59 crc kubenswrapper[4764]: I1001 16:29:59.028348 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dkmsh"] Oct 01 16:29:59 crc kubenswrapper[4764]: I1001 16:29:59.034895 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dkmsh"] Oct 01 16:29:59 crc kubenswrapper[4764]: I1001 16:29:59.742949 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b571b7-d584-46bf-823a-bf8ce35c8dac" 
path="/var/lib/kubelet/pods/44b571b7-d584-46bf-823a-bf8ce35c8dac/volumes" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.142615 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496"] Oct 01 16:30:00 crc kubenswrapper[4764]: E1001 16:30:00.143281 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="extract-content" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.143634 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="extract-content" Oct 01 16:30:00 crc kubenswrapper[4764]: E1001 16:30:00.143669 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="extract-utilities" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.143683 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="extract-utilities" Oct 01 16:30:00 crc kubenswrapper[4764]: E1001 16:30:00.143707 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="extract-content" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.143722 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="extract-content" Oct 01 16:30:00 crc kubenswrapper[4764]: E1001 16:30:00.143757 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="extract-utilities" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.143771 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="extract-utilities" Oct 01 16:30:00 crc kubenswrapper[4764]: E1001 16:30:00.143809 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="registry-server" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.143821 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="registry-server" Oct 01 16:30:00 crc kubenswrapper[4764]: E1001 16:30:00.143845 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="registry-server" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.143857 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="registry-server" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.144196 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d7d807-2803-4e0f-b18d-40de6e75d74c" containerName="registry-server" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.144243 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cce1994-45da-4b5e-9f0e-ca1fd68660b4" containerName="registry-server" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.145348 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.148699 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.155527 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.161319 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496"] Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.315446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrx6\" (UniqueName: \"kubernetes.io/projected/63ee5fea-bbee-4155-a586-fbc9af7e8608-kube-api-access-gwrx6\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.315858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63ee5fea-bbee-4155-a586-fbc9af7e8608-secret-volume\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.316042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63ee5fea-bbee-4155-a586-fbc9af7e8608-config-volume\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.418200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63ee5fea-bbee-4155-a586-fbc9af7e8608-secret-volume\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.418321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63ee5fea-bbee-4155-a586-fbc9af7e8608-config-volume\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.418395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrx6\" (UniqueName: \"kubernetes.io/projected/63ee5fea-bbee-4155-a586-fbc9af7e8608-kube-api-access-gwrx6\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.420462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63ee5fea-bbee-4155-a586-fbc9af7e8608-config-volume\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.427092 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/63ee5fea-bbee-4155-a586-fbc9af7e8608-secret-volume\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.443874 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrx6\" (UniqueName: \"kubernetes.io/projected/63ee5fea-bbee-4155-a586-fbc9af7e8608-kube-api-access-gwrx6\") pod \"collect-profiles-29322270-rh496\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.468712 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:00 crc kubenswrapper[4764]: I1001 16:30:00.911560 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496"] Oct 01 16:30:00 crc kubenswrapper[4764]: W1001 16:30:00.917275 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63ee5fea_bbee_4155_a586_fbc9af7e8608.slice/crio-7170a7ef16798f81c53da9be6441625ba244058ee0606440b85c5cdfad7859fe WatchSource:0}: Error finding container 7170a7ef16798f81c53da9be6441625ba244058ee0606440b85c5cdfad7859fe: Status 404 returned error can't find the container with id 7170a7ef16798f81c53da9be6441625ba244058ee0606440b85c5cdfad7859fe Oct 01 16:30:01 crc kubenswrapper[4764]: I1001 16:30:01.852249 4764 generic.go:334] "Generic (PLEG): container finished" podID="63ee5fea-bbee-4155-a586-fbc9af7e8608" containerID="c5fb19bc9012f9a463edd56225fc928d105aed2a16ef93d28d0f2024593ad666" exitCode=0 Oct 01 16:30:01 crc kubenswrapper[4764]: I1001 16:30:01.852312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" event={"ID":"63ee5fea-bbee-4155-a586-fbc9af7e8608","Type":"ContainerDied","Data":"c5fb19bc9012f9a463edd56225fc928d105aed2a16ef93d28d0f2024593ad666"} Oct 01 16:30:01 crc kubenswrapper[4764]: I1001 16:30:01.852598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" event={"ID":"63ee5fea-bbee-4155-a586-fbc9af7e8608","Type":"ContainerStarted","Data":"7170a7ef16798f81c53da9be6441625ba244058ee0606440b85c5cdfad7859fe"} Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.207885 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.384877 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63ee5fea-bbee-4155-a586-fbc9af7e8608-config-volume\") pod \"63ee5fea-bbee-4155-a586-fbc9af7e8608\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.386096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ee5fea-bbee-4155-a586-fbc9af7e8608-config-volume" (OuterVolumeSpecName: "config-volume") pod "63ee5fea-bbee-4155-a586-fbc9af7e8608" (UID: "63ee5fea-bbee-4155-a586-fbc9af7e8608"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.386458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwrx6\" (UniqueName: \"kubernetes.io/projected/63ee5fea-bbee-4155-a586-fbc9af7e8608-kube-api-access-gwrx6\") pod \"63ee5fea-bbee-4155-a586-fbc9af7e8608\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.387582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63ee5fea-bbee-4155-a586-fbc9af7e8608-secret-volume\") pod \"63ee5fea-bbee-4155-a586-fbc9af7e8608\" (UID: \"63ee5fea-bbee-4155-a586-fbc9af7e8608\") " Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.388531 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63ee5fea-bbee-4155-a586-fbc9af7e8608-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.392263 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ee5fea-bbee-4155-a586-fbc9af7e8608-kube-api-access-gwrx6" (OuterVolumeSpecName: "kube-api-access-gwrx6") pod "63ee5fea-bbee-4155-a586-fbc9af7e8608" (UID: "63ee5fea-bbee-4155-a586-fbc9af7e8608"). InnerVolumeSpecName "kube-api-access-gwrx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.393174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63ee5fea-bbee-4155-a586-fbc9af7e8608-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "63ee5fea-bbee-4155-a586-fbc9af7e8608" (UID: "63ee5fea-bbee-4155-a586-fbc9af7e8608"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.490418 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwrx6\" (UniqueName: \"kubernetes.io/projected/63ee5fea-bbee-4155-a586-fbc9af7e8608-kube-api-access-gwrx6\") on node \"crc\" DevicePath \"\"" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.490454 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63ee5fea-bbee-4155-a586-fbc9af7e8608-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.874782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" event={"ID":"63ee5fea-bbee-4155-a586-fbc9af7e8608","Type":"ContainerDied","Data":"7170a7ef16798f81c53da9be6441625ba244058ee0606440b85c5cdfad7859fe"} Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.874849 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7170a7ef16798f81c53da9be6441625ba244058ee0606440b85c5cdfad7859fe" Oct 01 16:30:03 crc kubenswrapper[4764]: I1001 16:30:03.874919 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496" Oct 01 16:30:21 crc kubenswrapper[4764]: I1001 16:30:21.914515 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:30:21 crc kubenswrapper[4764]: I1001 16:30:21.915139 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:30:21 crc kubenswrapper[4764]: I1001 16:30:21.915208 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:30:21 crc kubenswrapper[4764]: I1001 16:30:21.916175 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:30:21 crc kubenswrapper[4764]: I1001 16:30:21.916258 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" gracePeriod=600 Oct 01 16:30:22 crc kubenswrapper[4764]: E1001 16:30:22.043305 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:30:22 crc kubenswrapper[4764]: I1001 16:30:22.086807 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" exitCode=0 Oct 01 16:30:22 crc kubenswrapper[4764]: I1001 16:30:22.086886 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176"} Oct 01 16:30:22 crc kubenswrapper[4764]: I1001 16:30:22.086962 4764 scope.go:117] "RemoveContainer" containerID="4bf2e42740725b9d54c8d60efb2a207718601c4c6231f1e898fc274c1b294773" Oct 01 16:30:22 crc kubenswrapper[4764]: I1001 16:30:22.088022 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:30:22 crc kubenswrapper[4764]: E1001 16:30:22.088520 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:30:27 crc kubenswrapper[4764]: I1001 16:30:27.051340 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-t4ftm"] Oct 01 16:30:27 crc 
kubenswrapper[4764]: I1001 16:30:27.062525 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-t4ftm"] Oct 01 16:30:27 crc kubenswrapper[4764]: I1001 16:30:27.734390 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8756a942-4b09-43eb-b2a5-30048c7fa903" path="/var/lib/kubelet/pods/8756a942-4b09-43eb-b2a5-30048c7fa903/volumes" Oct 01 16:30:28 crc kubenswrapper[4764]: I1001 16:30:28.034409 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-x59d8"] Oct 01 16:30:28 crc kubenswrapper[4764]: I1001 16:30:28.044669 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mz6gj"] Oct 01 16:30:28 crc kubenswrapper[4764]: I1001 16:30:28.054165 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mz6gj"] Oct 01 16:30:28 crc kubenswrapper[4764]: I1001 16:30:28.061726 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-x59d8"] Oct 01 16:30:29 crc kubenswrapper[4764]: I1001 16:30:29.731108 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3867e107-7704-4f0f-acf6-356bd78d71af" path="/var/lib/kubelet/pods/3867e107-7704-4f0f-acf6-356bd78d71af/volumes" Oct 01 16:30:29 crc kubenswrapper[4764]: I1001 16:30:29.731804 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7418243b-5584-4f53-bded-549e34e415ca" path="/var/lib/kubelet/pods/7418243b-5584-4f53-bded-549e34e415ca/volumes" Oct 01 16:30:36 crc kubenswrapper[4764]: I1001 16:30:36.722562 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:30:36 crc kubenswrapper[4764]: E1001 16:30:36.723320 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:30:37 crc kubenswrapper[4764]: I1001 16:30:37.031874 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9ed4-account-create-q2w86"] Oct 01 16:30:37 crc kubenswrapper[4764]: I1001 16:30:37.041505 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8701-account-create-79w9q"] Oct 01 16:30:37 crc kubenswrapper[4764]: I1001 16:30:37.052185 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9ed4-account-create-q2w86"] Oct 01 16:30:37 crc kubenswrapper[4764]: I1001 16:30:37.058666 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8701-account-create-79w9q"] Oct 01 16:30:37 crc kubenswrapper[4764]: I1001 16:30:37.734288 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211211ae-6ed4-434c-86a7-f58ef4c5428b" path="/var/lib/kubelet/pods/211211ae-6ed4-434c-86a7-f58ef4c5428b/volumes" Oct 01 16:30:37 crc kubenswrapper[4764]: I1001 16:30:37.735368 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c79489-3d24-4631-b12e-3df33c87c1e0" path="/var/lib/kubelet/pods/a8c79489-3d24-4631-b12e-3df33c87c1e0/volumes" Oct 01 16:30:38 crc kubenswrapper[4764]: I1001 16:30:38.278197 4764 generic.go:334] "Generic (PLEG): container finished" podID="95442ee6-0604-4c09-b597-a2c7d1f19985" containerID="1cc8b29e1f2180ae404f33f65f3533100368a78feb0edd58fd47d8b4a4f029fb" exitCode=2 Oct 01 16:30:38 crc kubenswrapper[4764]: I1001 16:30:38.278239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" 
event={"ID":"95442ee6-0604-4c09-b597-a2c7d1f19985","Type":"ContainerDied","Data":"1cc8b29e1f2180ae404f33f65f3533100368a78feb0edd58fd47d8b4a4f029fb"} Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.764398 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.936665 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-inventory\") pod \"95442ee6-0604-4c09-b597-a2c7d1f19985\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.936778 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9qz7\" (UniqueName: \"kubernetes.io/projected/95442ee6-0604-4c09-b597-a2c7d1f19985-kube-api-access-h9qz7\") pod \"95442ee6-0604-4c09-b597-a2c7d1f19985\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.936909 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-ssh-key\") pod \"95442ee6-0604-4c09-b597-a2c7d1f19985\" (UID: \"95442ee6-0604-4c09-b597-a2c7d1f19985\") " Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.941631 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95442ee6-0604-4c09-b597-a2c7d1f19985-kube-api-access-h9qz7" (OuterVolumeSpecName: "kube-api-access-h9qz7") pod "95442ee6-0604-4c09-b597-a2c7d1f19985" (UID: "95442ee6-0604-4c09-b597-a2c7d1f19985"). InnerVolumeSpecName "kube-api-access-h9qz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.961670 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "95442ee6-0604-4c09-b597-a2c7d1f19985" (UID: "95442ee6-0604-4c09-b597-a2c7d1f19985"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:30:39 crc kubenswrapper[4764]: I1001 16:30:39.962625 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-inventory" (OuterVolumeSpecName: "inventory") pod "95442ee6-0604-4c09-b597-a2c7d1f19985" (UID: "95442ee6-0604-4c09-b597-a2c7d1f19985"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:30:40 crc kubenswrapper[4764]: I1001 16:30:40.038923 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:30:40 crc kubenswrapper[4764]: I1001 16:30:40.038951 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9qz7\" (UniqueName: \"kubernetes.io/projected/95442ee6-0604-4c09-b597-a2c7d1f19985-kube-api-access-h9qz7\") on node \"crc\" DevicePath \"\"" Oct 01 16:30:40 crc kubenswrapper[4764]: I1001 16:30:40.038963 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/95442ee6-0604-4c09-b597-a2c7d1f19985-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:30:40 crc kubenswrapper[4764]: I1001 16:30:40.300460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" 
event={"ID":"95442ee6-0604-4c09-b597-a2c7d1f19985","Type":"ContainerDied","Data":"2ed553f168d38629e1034258d0daa966eaab1f10ee18d4ee82db1d74b636a1ed"} Oct 01 16:30:40 crc kubenswrapper[4764]: I1001 16:30:40.300500 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8" Oct 01 16:30:40 crc kubenswrapper[4764]: I1001 16:30:40.300516 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ed553f168d38629e1034258d0daa966eaab1f10ee18d4ee82db1d74b636a1ed" Oct 01 16:30:45 crc kubenswrapper[4764]: I1001 16:30:45.936701 4764 scope.go:117] "RemoveContainer" containerID="70fb3d7826816ce50f8cd372843fa913b9ad4a73f6f38c5d71ac1f12f0328310" Oct 01 16:30:45 crc kubenswrapper[4764]: I1001 16:30:45.961716 4764 scope.go:117] "RemoveContainer" containerID="0b746ef53fd2437209b8aeb04cf0ceba9b7b17165435e9b4ac88d57c68206de2" Oct 01 16:30:46 crc kubenswrapper[4764]: I1001 16:30:46.004250 4764 scope.go:117] "RemoveContainer" containerID="6b83c48e9be8d4f73fd5389c6ae466fdc26fe6d5e3248013ac8a3df774e9504f" Oct 01 16:30:46 crc kubenswrapper[4764]: I1001 16:30:46.077540 4764 scope.go:117] "RemoveContainer" containerID="7cea4baac5c06c4f44ec89fbfc9cc94b99aa97a611ac96f33ca49f1e92ad6d36" Oct 01 16:30:46 crc kubenswrapper[4764]: I1001 16:30:46.100312 4764 scope.go:117] "RemoveContainer" containerID="c9b1c379e6188dbd099eba3b23671f6ec67044172ef7307a10e979cb87f578db" Oct 01 16:30:46 crc kubenswrapper[4764]: I1001 16:30:46.145484 4764 scope.go:117] "RemoveContainer" containerID="ece924ff63c46929ea0d32f523dea4e58ae3f16cc62d0a2d2ed1b6216573e35d" Oct 01 16:30:46 crc kubenswrapper[4764]: I1001 16:30:46.205036 4764 scope.go:117] "RemoveContainer" containerID="3c296db50d560a8194c5e3e898dea0151039b5fa1c97fe541c0da657dfe76449" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.028831 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft"] Oct 01 16:30:47 crc kubenswrapper[4764]: E1001 16:30:47.029517 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95442ee6-0604-4c09-b597-a2c7d1f19985" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.029540 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="95442ee6-0604-4c09-b597-a2c7d1f19985" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:30:47 crc kubenswrapper[4764]: E1001 16:30:47.029563 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ee5fea-bbee-4155-a586-fbc9af7e8608" containerName="collect-profiles" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.029569 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ee5fea-bbee-4155-a586-fbc9af7e8608" containerName="collect-profiles" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.029790 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="95442ee6-0604-4c09-b597-a2c7d1f19985" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.029805 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ee5fea-bbee-4155-a586-fbc9af7e8608" containerName="collect-profiles" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.030664 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.033234 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.033573 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.034590 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.034986 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.048012 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft"] Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.080380 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/63959548-6ede-4c97-9c70-aeebb4cbff8b-kube-api-access-9qmhb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.080474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.080519 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.182855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/63959548-6ede-4c97-9c70-aeebb4cbff8b-kube-api-access-9qmhb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.182954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.182996 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.190527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: 
\"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.190591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.204665 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/63959548-6ede-4c97-9c70-aeebb4cbff8b-kube-api-access-9qmhb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-v56ft\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.358755 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.730120 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:30:47 crc kubenswrapper[4764]: E1001 16:30:47.730917 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:30:47 crc kubenswrapper[4764]: I1001 16:30:47.901323 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft"] Oct 01 16:30:48 crc kubenswrapper[4764]: I1001 16:30:48.396256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" event={"ID":"63959548-6ede-4c97-9c70-aeebb4cbff8b","Type":"ContainerStarted","Data":"e511e3fd929999c8c5ae2508ab96bcbd62a20d28d0485fbcb9df3088fd764647"} Oct 01 16:30:49 crc kubenswrapper[4764]: I1001 16:30:49.405367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" event={"ID":"63959548-6ede-4c97-9c70-aeebb4cbff8b","Type":"ContainerStarted","Data":"272fa0ab9e95ed0ca13f416d177c42f9576a06ffd3a3c0e369f0b1f5b959bc1b"} Oct 01 16:30:49 crc kubenswrapper[4764]: I1001 16:30:49.420299 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" podStartSLOduration=1.57223039 podStartE2EDuration="2.420274199s" podCreationTimestamp="2025-10-01 16:30:47 +0000 UTC" firstStartedPulling="2025-10-01 
16:30:47.910488397 +0000 UTC m=+1710.910135232" lastFinishedPulling="2025-10-01 16:30:48.758532206 +0000 UTC m=+1711.758179041" observedRunningTime="2025-10-01 16:30:49.418545097 +0000 UTC m=+1712.418191952" watchObservedRunningTime="2025-10-01 16:30:49.420274199 +0000 UTC m=+1712.419921034" Oct 01 16:30:53 crc kubenswrapper[4764]: I1001 16:30:53.049614 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-af0f-account-create-xhcdp"] Oct 01 16:30:53 crc kubenswrapper[4764]: I1001 16:30:53.062245 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-af0f-account-create-xhcdp"] Oct 01 16:30:53 crc kubenswrapper[4764]: I1001 16:30:53.738117 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b789fde-2d6b-41ab-bdfb-8a3071d969f5" path="/var/lib/kubelet/pods/4b789fde-2d6b-41ab-bdfb-8a3071d969f5/volumes" Oct 01 16:31:01 crc kubenswrapper[4764]: I1001 16:31:01.722654 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:31:01 crc kubenswrapper[4764]: E1001 16:31:01.723489 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:31:03 crc kubenswrapper[4764]: I1001 16:31:03.032643 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dvptb"] Oct 01 16:31:03 crc kubenswrapper[4764]: I1001 16:31:03.042331 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dvptb"] Oct 01 16:31:03 crc kubenswrapper[4764]: I1001 16:31:03.738429 4764 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b0e44383-9ccd-4abc-9cce-aab97cce1388" path="/var/lib/kubelet/pods/b0e44383-9ccd-4abc-9cce-aab97cce1388/volumes" Oct 01 16:31:16 crc kubenswrapper[4764]: I1001 16:31:16.722338 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:31:16 crc kubenswrapper[4764]: E1001 16:31:16.724270 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:31:26 crc kubenswrapper[4764]: I1001 16:31:26.059345 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-sxv5x"] Oct 01 16:31:26 crc kubenswrapper[4764]: I1001 16:31:26.071348 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-sxv5x"] Oct 01 16:31:27 crc kubenswrapper[4764]: I1001 16:31:27.736828 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b99c5e9-3966-41d0-af69-2d0eb7a86d25" path="/var/lib/kubelet/pods/1b99c5e9-3966-41d0-af69-2d0eb7a86d25/volumes" Oct 01 16:31:28 crc kubenswrapper[4764]: I1001 16:31:28.037085 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r2sz8"] Oct 01 16:31:28 crc kubenswrapper[4764]: I1001 16:31:28.044201 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r2sz8"] Oct 01 16:31:29 crc kubenswrapper[4764]: I1001 16:31:29.722524 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:31:29 crc kubenswrapper[4764]: E1001 16:31:29.723550 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:31:29 crc kubenswrapper[4764]: I1001 16:31:29.739458 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bfb577-6442-4db0-b962-a89441eb7a9c" path="/var/lib/kubelet/pods/c6bfb577-6442-4db0-b962-a89441eb7a9c/volumes" Oct 01 16:31:40 crc kubenswrapper[4764]: I1001 16:31:40.972216 4764 generic.go:334] "Generic (PLEG): container finished" podID="63959548-6ede-4c97-9c70-aeebb4cbff8b" containerID="272fa0ab9e95ed0ca13f416d177c42f9576a06ffd3a3c0e369f0b1f5b959bc1b" exitCode=0 Oct 01 16:31:40 crc kubenswrapper[4764]: I1001 16:31:40.972315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" event={"ID":"63959548-6ede-4c97-9c70-aeebb4cbff8b","Type":"ContainerDied","Data":"272fa0ab9e95ed0ca13f416d177c42f9576a06ffd3a3c0e369f0b1f5b959bc1b"} Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.425300 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.534033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-inventory\") pod \"63959548-6ede-4c97-9c70-aeebb4cbff8b\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.534557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/63959548-6ede-4c97-9c70-aeebb4cbff8b-kube-api-access-9qmhb\") pod \"63959548-6ede-4c97-9c70-aeebb4cbff8b\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.534655 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-ssh-key\") pod \"63959548-6ede-4c97-9c70-aeebb4cbff8b\" (UID: \"63959548-6ede-4c97-9c70-aeebb4cbff8b\") " Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.542192 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63959548-6ede-4c97-9c70-aeebb4cbff8b-kube-api-access-9qmhb" (OuterVolumeSpecName: "kube-api-access-9qmhb") pod "63959548-6ede-4c97-9c70-aeebb4cbff8b" (UID: "63959548-6ede-4c97-9c70-aeebb4cbff8b"). InnerVolumeSpecName "kube-api-access-9qmhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.565428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "63959548-6ede-4c97-9c70-aeebb4cbff8b" (UID: "63959548-6ede-4c97-9c70-aeebb4cbff8b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.568698 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-inventory" (OuterVolumeSpecName: "inventory") pod "63959548-6ede-4c97-9c70-aeebb4cbff8b" (UID: "63959548-6ede-4c97-9c70-aeebb4cbff8b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.637378 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.637576 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qmhb\" (UniqueName: \"kubernetes.io/projected/63959548-6ede-4c97-9c70-aeebb4cbff8b-kube-api-access-9qmhb\") on node \"crc\" DevicePath \"\"" Oct 01 16:31:42 crc kubenswrapper[4764]: I1001 16:31:42.637619 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63959548-6ede-4c97-9c70-aeebb4cbff8b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.025002 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" event={"ID":"63959548-6ede-4c97-9c70-aeebb4cbff8b","Type":"ContainerDied","Data":"e511e3fd929999c8c5ae2508ab96bcbd62a20d28d0485fbcb9df3088fd764647"} Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.025089 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e511e3fd929999c8c5ae2508ab96bcbd62a20d28d0485fbcb9df3088fd764647" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.025174 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.126366 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tr4bk"] Oct 01 16:31:43 crc kubenswrapper[4764]: E1001 16:31:43.129392 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63959548-6ede-4c97-9c70-aeebb4cbff8b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.129639 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="63959548-6ede-4c97-9c70-aeebb4cbff8b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.132287 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="63959548-6ede-4c97-9c70-aeebb4cbff8b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.133083 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.134766 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tr4bk"] Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.136354 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.136708 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.136721 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.139521 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.253685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.253776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.253997 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v4jzv\" (UniqueName: \"kubernetes.io/projected/f803c1c5-bacb-4bb4-8c19-813f8c012625-kube-api-access-v4jzv\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.356100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jzv\" (UniqueName: \"kubernetes.io/projected/f803c1c5-bacb-4bb4-8c19-813f8c012625-kube-api-access-v4jzv\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.356297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.356352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.367495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 
16:31:43.369896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.391490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jzv\" (UniqueName: \"kubernetes.io/projected/f803c1c5-bacb-4bb4-8c19-813f8c012625-kube-api-access-v4jzv\") pod \"ssh-known-hosts-edpm-deployment-tr4bk\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.455356 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:43 crc kubenswrapper[4764]: I1001 16:31:43.722451 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:31:43 crc kubenswrapper[4764]: E1001 16:31:43.723222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:31:44 crc kubenswrapper[4764]: I1001 16:31:44.032271 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tr4bk"] Oct 01 16:31:45 crc kubenswrapper[4764]: I1001 16:31:45.062041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" 
event={"ID":"f803c1c5-bacb-4bb4-8c19-813f8c012625","Type":"ContainerStarted","Data":"82a1cec33b73cf1a610c0c98aa69ac0cee97502e8599323923b3d94275087e8a"} Oct 01 16:31:45 crc kubenswrapper[4764]: I1001 16:31:45.062362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" event={"ID":"f803c1c5-bacb-4bb4-8c19-813f8c012625","Type":"ContainerStarted","Data":"8fd89eb25947a451777ac609334e718d0d96ef2b82b6753393fa53f00db4015c"} Oct 01 16:31:45 crc kubenswrapper[4764]: I1001 16:31:45.094808 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" podStartSLOduration=1.594696347 podStartE2EDuration="2.094776603s" podCreationTimestamp="2025-10-01 16:31:43 +0000 UTC" firstStartedPulling="2025-10-01 16:31:44.039699672 +0000 UTC m=+1767.039346547" lastFinishedPulling="2025-10-01 16:31:44.539779968 +0000 UTC m=+1767.539426803" observedRunningTime="2025-10-01 16:31:45.080643066 +0000 UTC m=+1768.080289941" watchObservedRunningTime="2025-10-01 16:31:45.094776603 +0000 UTC m=+1768.094423448" Oct 01 16:31:46 crc kubenswrapper[4764]: I1001 16:31:46.454201 4764 scope.go:117] "RemoveContainer" containerID="bd63c8eafdb2a5f37ac4d05c15c3e6fee2e9a97cb0ab101b9005fc7db2ee1a4a" Oct 01 16:31:46 crc kubenswrapper[4764]: I1001 16:31:46.501826 4764 scope.go:117] "RemoveContainer" containerID="1a80f7f6f8927cfdb5534a5d0fc06b5b6fbfdd26e2766c8df9b54722bcec1a8d" Oct 01 16:31:46 crc kubenswrapper[4764]: I1001 16:31:46.598898 4764 scope.go:117] "RemoveContainer" containerID="e93aa8dcb65cb5085dc7f7cd9bbe91cf5fd98006535dba31bd46aa42e7804e4b" Oct 01 16:31:46 crc kubenswrapper[4764]: I1001 16:31:46.648635 4764 scope.go:117] "RemoveContainer" containerID="e164c82ca96c5f7cedee0ab66c69a19c4a1fe02bcb5a45e9eb658c45143c73f3" Oct 01 16:31:52 crc kubenswrapper[4764]: I1001 16:31:52.132856 4764 generic.go:334] "Generic (PLEG): container finished" podID="f803c1c5-bacb-4bb4-8c19-813f8c012625" 
containerID="82a1cec33b73cf1a610c0c98aa69ac0cee97502e8599323923b3d94275087e8a" exitCode=0 Oct 01 16:31:52 crc kubenswrapper[4764]: I1001 16:31:52.132958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" event={"ID":"f803c1c5-bacb-4bb4-8c19-813f8c012625","Type":"ContainerDied","Data":"82a1cec33b73cf1a610c0c98aa69ac0cee97502e8599323923b3d94275087e8a"} Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.630722 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.798428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-ssh-key-openstack-edpm-ipam\") pod \"f803c1c5-bacb-4bb4-8c19-813f8c012625\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.798704 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-inventory-0\") pod \"f803c1c5-bacb-4bb4-8c19-813f8c012625\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.798751 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4jzv\" (UniqueName: \"kubernetes.io/projected/f803c1c5-bacb-4bb4-8c19-813f8c012625-kube-api-access-v4jzv\") pod \"f803c1c5-bacb-4bb4-8c19-813f8c012625\" (UID: \"f803c1c5-bacb-4bb4-8c19-813f8c012625\") " Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.809930 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f803c1c5-bacb-4bb4-8c19-813f8c012625-kube-api-access-v4jzv" (OuterVolumeSpecName: "kube-api-access-v4jzv") pod 
"f803c1c5-bacb-4bb4-8c19-813f8c012625" (UID: "f803c1c5-bacb-4bb4-8c19-813f8c012625"). InnerVolumeSpecName "kube-api-access-v4jzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.824879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f803c1c5-bacb-4bb4-8c19-813f8c012625" (UID: "f803c1c5-bacb-4bb4-8c19-813f8c012625"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.831198 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f803c1c5-bacb-4bb4-8c19-813f8c012625" (UID: "f803c1c5-bacb-4bb4-8c19-813f8c012625"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.902013 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.902317 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f803c1c5-bacb-4bb4-8c19-813f8c012625-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:31:53 crc kubenswrapper[4764]: I1001 16:31:53.902398 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4jzv\" (UniqueName: \"kubernetes.io/projected/f803c1c5-bacb-4bb4-8c19-813f8c012625-kube-api-access-v4jzv\") on node \"crc\" DevicePath \"\"" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.160772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" event={"ID":"f803c1c5-bacb-4bb4-8c19-813f8c012625","Type":"ContainerDied","Data":"8fd89eb25947a451777ac609334e718d0d96ef2b82b6753393fa53f00db4015c"} Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.160828 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fd89eb25947a451777ac609334e718d0d96ef2b82b6753393fa53f00db4015c" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.160905 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tr4bk" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.250535 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl"] Oct 01 16:31:54 crc kubenswrapper[4764]: E1001 16:31:54.251326 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f803c1c5-bacb-4bb4-8c19-813f8c012625" containerName="ssh-known-hosts-edpm-deployment" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.251361 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f803c1c5-bacb-4bb4-8c19-813f8c012625" containerName="ssh-known-hosts-edpm-deployment" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.251733 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f803c1c5-bacb-4bb4-8c19-813f8c012625" containerName="ssh-known-hosts-edpm-deployment" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.252772 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.255554 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.256243 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.256427 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.257480 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.275209 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl"] Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.412404 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnj9c\" (UniqueName: \"kubernetes.io/projected/1612b417-4f29-4c05-ab4b-c3d7f193a17c-kube-api-access-pnj9c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.412518 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.412587 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.514655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnj9c\" (UniqueName: \"kubernetes.io/projected/1612b417-4f29-4c05-ab4b-c3d7f193a17c-kube-api-access-pnj9c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.514867 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.514961 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.523258 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.528965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.555416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnj9c\" (UniqueName: \"kubernetes.io/projected/1612b417-4f29-4c05-ab4b-c3d7f193a17c-kube-api-access-pnj9c\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bvcsl\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:54 crc kubenswrapper[4764]: I1001 16:31:54.584138 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:31:55 crc kubenswrapper[4764]: I1001 16:31:55.264712 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl"] Oct 01 16:31:56 crc kubenswrapper[4764]: I1001 16:31:56.194217 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" event={"ID":"1612b417-4f29-4c05-ab4b-c3d7f193a17c","Type":"ContainerStarted","Data":"ae316cc9e948a938c3b8b8f6bfad61208acfc3fd7b7d125787f8086f8574a0f6"} Oct 01 16:31:58 crc kubenswrapper[4764]: I1001 16:31:58.721679 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:31:58 crc kubenswrapper[4764]: E1001 16:31:58.722806 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:31:59 crc kubenswrapper[4764]: I1001 16:31:59.231411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" event={"ID":"1612b417-4f29-4c05-ab4b-c3d7f193a17c","Type":"ContainerStarted","Data":"6df8c1aae9178e046c4a55faacd040477294c548d5feb7d84f245648218a3866"} Oct 01 16:31:59 crc kubenswrapper[4764]: I1001 16:31:59.271914 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" podStartSLOduration=2.508110162 podStartE2EDuration="5.271889195s" podCreationTimestamp="2025-10-01 16:31:54 +0000 UTC" firstStartedPulling="2025-10-01 16:31:55.269452638 +0000 UTC 
m=+1778.269099483" lastFinishedPulling="2025-10-01 16:31:58.033231641 +0000 UTC m=+1781.032878516" observedRunningTime="2025-10-01 16:31:59.261354557 +0000 UTC m=+1782.261001462" watchObservedRunningTime="2025-10-01 16:31:59.271889195 +0000 UTC m=+1782.271536070" Oct 01 16:32:07 crc kubenswrapper[4764]: I1001 16:32:07.325727 4764 generic.go:334] "Generic (PLEG): container finished" podID="1612b417-4f29-4c05-ab4b-c3d7f193a17c" containerID="6df8c1aae9178e046c4a55faacd040477294c548d5feb7d84f245648218a3866" exitCode=0 Oct 01 16:32:07 crc kubenswrapper[4764]: I1001 16:32:07.325848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" event={"ID":"1612b417-4f29-4c05-ab4b-c3d7f193a17c","Type":"ContainerDied","Data":"6df8c1aae9178e046c4a55faacd040477294c548d5feb7d84f245648218a3866"} Oct 01 16:32:08 crc kubenswrapper[4764]: I1001 16:32:08.814004 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:32:08 crc kubenswrapper[4764]: I1001 16:32:08.961954 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-ssh-key\") pod \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " Oct 01 16:32:08 crc kubenswrapper[4764]: I1001 16:32:08.962076 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-inventory\") pod \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " Oct 01 16:32:08 crc kubenswrapper[4764]: I1001 16:32:08.962204 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnj9c\" (UniqueName: 
\"kubernetes.io/projected/1612b417-4f29-4c05-ab4b-c3d7f193a17c-kube-api-access-pnj9c\") pod \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\" (UID: \"1612b417-4f29-4c05-ab4b-c3d7f193a17c\") " Oct 01 16:32:08 crc kubenswrapper[4764]: I1001 16:32:08.973443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1612b417-4f29-4c05-ab4b-c3d7f193a17c-kube-api-access-pnj9c" (OuterVolumeSpecName: "kube-api-access-pnj9c") pod "1612b417-4f29-4c05-ab4b-c3d7f193a17c" (UID: "1612b417-4f29-4c05-ab4b-c3d7f193a17c"). InnerVolumeSpecName "kube-api-access-pnj9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:32:08 crc kubenswrapper[4764]: I1001 16:32:08.997471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-inventory" (OuterVolumeSpecName: "inventory") pod "1612b417-4f29-4c05-ab4b-c3d7f193a17c" (UID: "1612b417-4f29-4c05-ab4b-c3d7f193a17c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.010685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1612b417-4f29-4c05-ab4b-c3d7f193a17c" (UID: "1612b417-4f29-4c05-ab4b-c3d7f193a17c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.065723 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.065774 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1612b417-4f29-4c05-ab4b-c3d7f193a17c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.065796 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnj9c\" (UniqueName: \"kubernetes.io/projected/1612b417-4f29-4c05-ab4b-c3d7f193a17c-kube-api-access-pnj9c\") on node \"crc\" DevicePath \"\"" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.351535 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" event={"ID":"1612b417-4f29-4c05-ab4b-c3d7f193a17c","Type":"ContainerDied","Data":"ae316cc9e948a938c3b8b8f6bfad61208acfc3fd7b7d125787f8086f8574a0f6"} Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.351936 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae316cc9e948a938c3b8b8f6bfad61208acfc3fd7b7d125787f8086f8574a0f6" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.351642 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.449581 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g"] Oct 01 16:32:09 crc kubenswrapper[4764]: E1001 16:32:09.450179 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1612b417-4f29-4c05-ab4b-c3d7f193a17c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.450203 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1612b417-4f29-4c05-ab4b-c3d7f193a17c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.450500 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1612b417-4f29-4c05-ab4b-c3d7f193a17c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.451272 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.455597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.456000 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.456716 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.456961 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.466408 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g"] Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.575009 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzxl\" (UniqueName: \"kubernetes.io/projected/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-kube-api-access-mqzxl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.575310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.575462 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.677343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzxl\" (UniqueName: \"kubernetes.io/projected/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-kube-api-access-mqzxl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.677457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.677610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.685642 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.685897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.704760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzxl\" (UniqueName: \"kubernetes.io/projected/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-kube-api-access-mqzxl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:09 crc kubenswrapper[4764]: I1001 16:32:09.769564 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:10 crc kubenswrapper[4764]: I1001 16:32:10.065711 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-plxmm"] Oct 01 16:32:10 crc kubenswrapper[4764]: I1001 16:32:10.076632 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-plxmm"] Oct 01 16:32:10 crc kubenswrapper[4764]: W1001 16:32:10.165025 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b3a4323_b2aa_4b79_a155_6485ec72fe1f.slice/crio-e96f853955d3313ac261760b050e30d7f4eca9fa5df603212b5a51d6c2932d9c WatchSource:0}: Error finding container e96f853955d3313ac261760b050e30d7f4eca9fa5df603212b5a51d6c2932d9c: Status 404 returned error can't find the container with id e96f853955d3313ac261760b050e30d7f4eca9fa5df603212b5a51d6c2932d9c Oct 01 16:32:10 crc kubenswrapper[4764]: I1001 16:32:10.168442 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g"] Oct 01 16:32:10 crc kubenswrapper[4764]: I1001 16:32:10.362918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" event={"ID":"7b3a4323-b2aa-4b79-a155-6485ec72fe1f","Type":"ContainerStarted","Data":"e96f853955d3313ac261760b050e30d7f4eca9fa5df603212b5a51d6c2932d9c"} Oct 01 16:32:11 crc kubenswrapper[4764]: I1001 16:32:11.721841 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:32:11 crc kubenswrapper[4764]: E1001 16:32:11.722447 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:32:11 crc kubenswrapper[4764]: I1001 16:32:11.736290 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bfc22be-0529-4b69-b782-c21bdd4fdaa6" path="/var/lib/kubelet/pods/5bfc22be-0529-4b69-b782-c21bdd4fdaa6/volumes" Oct 01 16:32:12 crc kubenswrapper[4764]: I1001 16:32:12.387515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" event={"ID":"7b3a4323-b2aa-4b79-a155-6485ec72fe1f","Type":"ContainerStarted","Data":"40c706c7ecd0f3dcd1a801fd0481d9e4ee2bdb09b983e3f68f19611b1d3bbb78"} Oct 01 16:32:12 crc kubenswrapper[4764]: I1001 16:32:12.410496 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" podStartSLOduration=2.328417881 podStartE2EDuration="3.410466283s" podCreationTimestamp="2025-10-01 16:32:09 +0000 UTC" firstStartedPulling="2025-10-01 16:32:10.168172231 +0000 UTC m=+1793.167819066" lastFinishedPulling="2025-10-01 16:32:11.250220593 +0000 UTC m=+1794.249867468" observedRunningTime="2025-10-01 16:32:12.406580318 +0000 UTC m=+1795.406227163" watchObservedRunningTime="2025-10-01 16:32:12.410466283 +0000 UTC m=+1795.410113138" Oct 01 16:32:21 crc kubenswrapper[4764]: I1001 16:32:21.501180 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b3a4323-b2aa-4b79-a155-6485ec72fe1f" containerID="40c706c7ecd0f3dcd1a801fd0481d9e4ee2bdb09b983e3f68f19611b1d3bbb78" exitCode=0 Oct 01 16:32:21 crc kubenswrapper[4764]: I1001 16:32:21.501250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" 
event={"ID":"7b3a4323-b2aa-4b79-a155-6485ec72fe1f","Type":"ContainerDied","Data":"40c706c7ecd0f3dcd1a801fd0481d9e4ee2bdb09b983e3f68f19611b1d3bbb78"} Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.722065 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:32:22 crc kubenswrapper[4764]: E1001 16:32:22.722575 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.933315 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.954660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-ssh-key\") pod \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.954917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-inventory\") pod \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.955238 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqzxl\" (UniqueName: \"kubernetes.io/projected/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-kube-api-access-mqzxl\") pod 
\"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\" (UID: \"7b3a4323-b2aa-4b79-a155-6485ec72fe1f\") " Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.967512 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-kube-api-access-mqzxl" (OuterVolumeSpecName: "kube-api-access-mqzxl") pod "7b3a4323-b2aa-4b79-a155-6485ec72fe1f" (UID: "7b3a4323-b2aa-4b79-a155-6485ec72fe1f"). InnerVolumeSpecName "kube-api-access-mqzxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.980624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7b3a4323-b2aa-4b79-a155-6485ec72fe1f" (UID: "7b3a4323-b2aa-4b79-a155-6485ec72fe1f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:32:22 crc kubenswrapper[4764]: I1001 16:32:22.988812 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-inventory" (OuterVolumeSpecName: "inventory") pod "7b3a4323-b2aa-4b79-a155-6485ec72fe1f" (UID: "7b3a4323-b2aa-4b79-a155-6485ec72fe1f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:32:23 crc kubenswrapper[4764]: I1001 16:32:23.059196 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:32:23 crc kubenswrapper[4764]: I1001 16:32:23.059254 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:32:23 crc kubenswrapper[4764]: I1001 16:32:23.059268 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqzxl\" (UniqueName: \"kubernetes.io/projected/7b3a4323-b2aa-4b79-a155-6485ec72fe1f-kube-api-access-mqzxl\") on node \"crc\" DevicePath \"\"" Oct 01 16:32:23 crc kubenswrapper[4764]: I1001 16:32:23.522176 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" event={"ID":"7b3a4323-b2aa-4b79-a155-6485ec72fe1f","Type":"ContainerDied","Data":"e96f853955d3313ac261760b050e30d7f4eca9fa5df603212b5a51d6c2932d9c"} Oct 01 16:32:23 crc kubenswrapper[4764]: I1001 16:32:23.522218 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e96f853955d3313ac261760b050e30d7f4eca9fa5df603212b5a51d6c2932d9c" Oct 01 16:32:23 crc kubenswrapper[4764]: I1001 16:32:23.522260 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g" Oct 01 16:32:33 crc kubenswrapper[4764]: I1001 16:32:33.722303 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:32:33 crc kubenswrapper[4764]: E1001 16:32:33.723391 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:32:46 crc kubenswrapper[4764]: I1001 16:32:46.722016 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:32:46 crc kubenswrapper[4764]: E1001 16:32:46.723125 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:32:46 crc kubenswrapper[4764]: I1001 16:32:46.767800 4764 scope.go:117] "RemoveContainer" containerID="0b10634bad5c7a3310f86837fad12ef9e8f0631d15248903281ab255943c2654" Oct 01 16:33:01 crc kubenswrapper[4764]: I1001 16:33:01.722826 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:33:01 crc kubenswrapper[4764]: E1001 16:33:01.723831 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:33:13 crc kubenswrapper[4764]: I1001 16:33:13.721629 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:33:13 crc kubenswrapper[4764]: E1001 16:33:13.722684 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:33:28 crc kubenswrapper[4764]: I1001 16:33:28.722216 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:33:28 crc kubenswrapper[4764]: E1001 16:33:28.723217 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:33:43 crc kubenswrapper[4764]: I1001 16:33:43.722297 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:33:43 crc kubenswrapper[4764]: E1001 16:33:43.723109 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:33:46 crc kubenswrapper[4764]: I1001 16:33:46.875943 4764 scope.go:117] "RemoveContainer" containerID="08138863923be7b2ac1dd93bc5e1f2422461d6ce0da0271687ff1681c757d747" Oct 01 16:33:54 crc kubenswrapper[4764]: I1001 16:33:54.722791 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:33:54 crc kubenswrapper[4764]: E1001 16:33:54.723894 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:34:08 crc kubenswrapper[4764]: I1001 16:34:08.722426 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:34:08 crc kubenswrapper[4764]: E1001 16:34:08.723530 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:34:22 crc kubenswrapper[4764]: I1001 16:34:22.721904 4764 scope.go:117] "RemoveContainer" 
containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:34:22 crc kubenswrapper[4764]: E1001 16:34:22.722699 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:34:37 crc kubenswrapper[4764]: I1001 16:34:37.728299 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:34:37 crc kubenswrapper[4764]: E1001 16:34:37.729379 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:34:52 crc kubenswrapper[4764]: I1001 16:34:52.722147 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:34:52 crc kubenswrapper[4764]: E1001 16:34:52.723177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:35:03 crc kubenswrapper[4764]: I1001 16:35:03.722523 4764 scope.go:117] 
"RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:35:03 crc kubenswrapper[4764]: E1001 16:35:03.723584 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:35:16 crc kubenswrapper[4764]: I1001 16:35:16.722242 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:35:16 crc kubenswrapper[4764]: E1001 16:35:16.723431 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:35:31 crc kubenswrapper[4764]: I1001 16:35:31.722385 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:35:32 crc kubenswrapper[4764]: I1001 16:35:32.534352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"d0b778d1376a46a6d5f3e2b028c1b5a6ac86778ba307f59878807e5928f0d1b7"} Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.490935 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5769r"] Oct 01 16:36:13 crc kubenswrapper[4764]: E1001 
16:36:13.497907 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3a4323-b2aa-4b79-a155-6485ec72fe1f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.497953 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3a4323-b2aa-4b79-a155-6485ec72fe1f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.498841 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3a4323-b2aa-4b79-a155-6485ec72fe1f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.509259 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.537852 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5769r"] Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.584968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9wf\" (UniqueName: \"kubernetes.io/projected/38a53cf3-3884-4942-9d26-80266ece174e-kube-api-access-hp9wf\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.585093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-catalog-content\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.585182 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-utilities\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.686813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9wf\" (UniqueName: \"kubernetes.io/projected/38a53cf3-3884-4942-9d26-80266ece174e-kube-api-access-hp9wf\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.686937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-catalog-content\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.686991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-utilities\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.687510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-catalog-content\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.687571 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-utilities\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.714882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9wf\" (UniqueName: \"kubernetes.io/projected/38a53cf3-3884-4942-9d26-80266ece174e-kube-api-access-hp9wf\") pod \"redhat-operators-5769r\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:13 crc kubenswrapper[4764]: I1001 16:36:13.839016 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:14 crc kubenswrapper[4764]: I1001 16:36:14.286668 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5769r"] Oct 01 16:36:15 crc kubenswrapper[4764]: I1001 16:36:15.016965 4764 generic.go:334] "Generic (PLEG): container finished" podID="38a53cf3-3884-4942-9d26-80266ece174e" containerID="c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d" exitCode=0 Oct 01 16:36:15 crc kubenswrapper[4764]: I1001 16:36:15.017060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerDied","Data":"c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d"} Oct 01 16:36:15 crc kubenswrapper[4764]: I1001 16:36:15.017349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerStarted","Data":"0c20e29a88f6e41425f3b4f681be601152ef3559981768b9802777c859877393"} Oct 01 16:36:15 crc kubenswrapper[4764]: I1001 16:36:15.019176 4764 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 01 16:36:17 crc kubenswrapper[4764]: I1001 16:36:17.037515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerStarted","Data":"d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2"} Oct 01 16:36:21 crc kubenswrapper[4764]: I1001 16:36:21.106741 4764 generic.go:334] "Generic (PLEG): container finished" podID="38a53cf3-3884-4942-9d26-80266ece174e" containerID="d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2" exitCode=0 Oct 01 16:36:21 crc kubenswrapper[4764]: I1001 16:36:21.107268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerDied","Data":"d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2"} Oct 01 16:36:22 crc kubenswrapper[4764]: I1001 16:36:22.120083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerStarted","Data":"66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6"} Oct 01 16:36:22 crc kubenswrapper[4764]: I1001 16:36:22.149498 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5769r" podStartSLOduration=2.546120057 podStartE2EDuration="9.149481151s" podCreationTimestamp="2025-10-01 16:36:13 +0000 UTC" firstStartedPulling="2025-10-01 16:36:15.01890844 +0000 UTC m=+2038.018555275" lastFinishedPulling="2025-10-01 16:36:21.622269534 +0000 UTC m=+2044.621916369" observedRunningTime="2025-10-01 16:36:22.139374193 +0000 UTC m=+2045.139021038" watchObservedRunningTime="2025-10-01 16:36:22.149481151 +0000 UTC m=+2045.149127986" Oct 01 16:36:23 crc kubenswrapper[4764]: I1001 16:36:23.840200 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:23 crc kubenswrapper[4764]: I1001 16:36:23.840532 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:24 crc kubenswrapper[4764]: I1001 16:36:24.890496 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5769r" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="registry-server" probeResult="failure" output=< Oct 01 16:36:24 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Oct 01 16:36:24 crc kubenswrapper[4764]: > Oct 01 16:36:33 crc kubenswrapper[4764]: I1001 16:36:33.906195 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:33 crc kubenswrapper[4764]: I1001 16:36:33.963649 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:34 crc kubenswrapper[4764]: I1001 16:36:34.156850 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5769r"] Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.238165 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5769r" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="registry-server" containerID="cri-o://66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6" gracePeriod=2 Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.673370 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.714857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-catalog-content\") pod \"38a53cf3-3884-4942-9d26-80266ece174e\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.715173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp9wf\" (UniqueName: \"kubernetes.io/projected/38a53cf3-3884-4942-9d26-80266ece174e-kube-api-access-hp9wf\") pod \"38a53cf3-3884-4942-9d26-80266ece174e\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.715246 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-utilities\") pod \"38a53cf3-3884-4942-9d26-80266ece174e\" (UID: \"38a53cf3-3884-4942-9d26-80266ece174e\") " Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.716585 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-utilities" (OuterVolumeSpecName: "utilities") pod "38a53cf3-3884-4942-9d26-80266ece174e" (UID: "38a53cf3-3884-4942-9d26-80266ece174e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.726293 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a53cf3-3884-4942-9d26-80266ece174e-kube-api-access-hp9wf" (OuterVolumeSpecName: "kube-api-access-hp9wf") pod "38a53cf3-3884-4942-9d26-80266ece174e" (UID: "38a53cf3-3884-4942-9d26-80266ece174e"). InnerVolumeSpecName "kube-api-access-hp9wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.808444 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38a53cf3-3884-4942-9d26-80266ece174e" (UID: "38a53cf3-3884-4942-9d26-80266ece174e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.817583 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp9wf\" (UniqueName: \"kubernetes.io/projected/38a53cf3-3884-4942-9d26-80266ece174e-kube-api-access-hp9wf\") on node \"crc\" DevicePath \"\"" Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.817617 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:36:35 crc kubenswrapper[4764]: I1001 16:36:35.817633 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a53cf3-3884-4942-9d26-80266ece174e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.251292 4764 generic.go:334] "Generic (PLEG): container finished" podID="38a53cf3-3884-4942-9d26-80266ece174e" containerID="66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6" exitCode=0 Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.251355 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerDied","Data":"66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6"} Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.251382 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5769r" event={"ID":"38a53cf3-3884-4942-9d26-80266ece174e","Type":"ContainerDied","Data":"0c20e29a88f6e41425f3b4f681be601152ef3559981768b9802777c859877393"} Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.251402 4764 scope.go:117] "RemoveContainer" containerID="66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.251549 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5769r" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.285974 4764 scope.go:117] "RemoveContainer" containerID="d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.286894 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5769r"] Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.298079 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5769r"] Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.311274 4764 scope.go:117] "RemoveContainer" containerID="c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.377620 4764 scope.go:117] "RemoveContainer" containerID="66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6" Oct 01 16:36:36 crc kubenswrapper[4764]: E1001 16:36:36.378347 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6\": container with ID starting with 66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6 not found: ID does not exist" containerID="66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.378380 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6"} err="failed to get container status \"66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6\": rpc error: code = NotFound desc = could not find container \"66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6\": container with ID starting with 66683edcde89f39248e1635057f00dfe78d4f0865e982c77ee8557cc8453c7e6 not found: ID does not exist" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.378406 4764 scope.go:117] "RemoveContainer" containerID="d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2" Oct 01 16:36:36 crc kubenswrapper[4764]: E1001 16:36:36.378815 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2\": container with ID starting with d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2 not found: ID does not exist" containerID="d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.378837 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2"} err="failed to get container status \"d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2\": rpc error: code = NotFound desc = could not find container \"d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2\": container with ID starting with d11cdfab05337aed235015f68ff83a39a765d33bcb95c6d55645066eecffaff2 not found: ID does not exist" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.378856 4764 scope.go:117] "RemoveContainer" containerID="c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d" Oct 01 16:36:36 crc kubenswrapper[4764]: E1001 
16:36:36.379408 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d\": container with ID starting with c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d not found: ID does not exist" containerID="c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d" Oct 01 16:36:36 crc kubenswrapper[4764]: I1001 16:36:36.379430 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d"} err="failed to get container status \"c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d\": rpc error: code = NotFound desc = could not find container \"c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d\": container with ID starting with c1aa8fb12965ed92ebbb8915d7005d0e1977f885952eb602864245e4b359781d not found: ID does not exist" Oct 01 16:36:37 crc kubenswrapper[4764]: I1001 16:36:37.733733 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a53cf3-3884-4942-9d26-80266ece174e" path="/var/lib/kubelet/pods/38a53cf3-3884-4942-9d26-80266ece174e/volumes" Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.808920 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tr4bk"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.817596 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tr4bk"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.826696 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.838206 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl"] Oct 01 16:37:00 
crc kubenswrapper[4764]: I1001 16:37:00.846761 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.853166 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.858299 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.869008 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.869073 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-v56ft"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.877754 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.888321 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.901169 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.909751 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bvcsl"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.915785 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-mnrmz"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.921522 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.927532 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bjrw8"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.933538 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-vjx8j"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.939475 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-8stpv"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.945391 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-55rbc"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.951294 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dd5gj"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.957077 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kdntm"] Oct 01 16:37:00 crc kubenswrapper[4764]: I1001 16:37:00.963090 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-n5h2g"] Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.736689 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1612b417-4f29-4c05-ab4b-c3d7f193a17c" path="/var/lib/kubelet/pods/1612b417-4f29-4c05-ab4b-c3d7f193a17c/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.737905 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd" path="/var/lib/kubelet/pods/1ce27b0c-a46c-4c58-bd63-4edcf6c07ecd/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.739280 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="63959548-6ede-4c97-9c70-aeebb4cbff8b" path="/var/lib/kubelet/pods/63959548-6ede-4c97-9c70-aeebb4cbff8b/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.741543 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66071483-0a25-4d14-afea-3f08fe54ddc5" path="/var/lib/kubelet/pods/66071483-0a25-4d14-afea-3f08fe54ddc5/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.743192 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3a4323-b2aa-4b79-a155-6485ec72fe1f" path="/var/lib/kubelet/pods/7b3a4323-b2aa-4b79-a155-6485ec72fe1f/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.743810 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c29699-15b0-4a46-b9f6-afb8330b459d" path="/var/lib/kubelet/pods/86c29699-15b0-4a46-b9f6-afb8330b459d/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.744443 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95442ee6-0604-4c09-b597-a2c7d1f19985" path="/var/lib/kubelet/pods/95442ee6-0604-4c09-b597-a2c7d1f19985/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.745133 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b541d9-8d8f-4dfb-9c59-f00ba871257c" path="/var/lib/kubelet/pods/a4b541d9-8d8f-4dfb-9c59-f00ba871257c/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.745907 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb0612b-19c6-49e5-992d-d7fb9869eeeb" path="/var/lib/kubelet/pods/bcb0612b-19c6-49e5-992d-d7fb9869eeeb/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.746744 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46429d1-3fd7-40af-8e14-05b775bc1197" path="/var/lib/kubelet/pods/e46429d1-3fd7-40af-8e14-05b775bc1197/volumes" Oct 01 16:37:01 crc kubenswrapper[4764]: I1001 16:37:01.748167 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f803c1c5-bacb-4bb4-8c19-813f8c012625" path="/var/lib/kubelet/pods/f803c1c5-bacb-4bb4-8c19-813f8c012625/volumes" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.715936 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf"] Oct 01 16:37:06 crc kubenswrapper[4764]: E1001 16:37:06.716676 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="registry-server" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.716687 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="registry-server" Oct 01 16:37:06 crc kubenswrapper[4764]: E1001 16:37:06.716716 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="extract-utilities" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.716722 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="extract-utilities" Oct 01 16:37:06 crc kubenswrapper[4764]: E1001 16:37:06.716743 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="extract-content" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.716749 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="extract-content" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.716919 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a53cf3-3884-4942-9d26-80266ece174e" containerName="registry-server" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.717500 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.720020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.720449 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.721247 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.721755 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.725917 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.730583 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf"] Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.847327 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.847447 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: 
\"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.847486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2r7g\" (UniqueName: \"kubernetes.io/projected/839c68a1-2404-4037-8975-58e6b02ba81f-kube-api-access-g2r7g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.847538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.848258 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.949991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.950212 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2r7g\" (UniqueName: \"kubernetes.io/projected/839c68a1-2404-4037-8975-58e6b02ba81f-kube-api-access-g2r7g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.950281 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.950386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:06 crc kubenswrapper[4764]: I1001 16:37:06.950481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.030964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: 
\"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.031209 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2r7g\" (UniqueName: \"kubernetes.io/projected/839c68a1-2404-4037-8975-58e6b02ba81f-kube-api-access-g2r7g\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.031896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.034543 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.047550 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.260443 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:07 crc kubenswrapper[4764]: I1001 16:37:07.817982 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf"] Oct 01 16:37:08 crc kubenswrapper[4764]: I1001 16:37:08.576778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" event={"ID":"839c68a1-2404-4037-8975-58e6b02ba81f","Type":"ContainerStarted","Data":"afe05fa2ceeee5c96c9c2e30d919cadf739fe6d22553eec2e0fa85b4003bcd49"} Oct 01 16:37:09 crc kubenswrapper[4764]: I1001 16:37:09.589705 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" event={"ID":"839c68a1-2404-4037-8975-58e6b02ba81f","Type":"ContainerStarted","Data":"fe62fb8f0f67b7e6e9f2c7a5e91332cb28aeef7e3b113f68b33ec990a41da33c"} Oct 01 16:37:09 crc kubenswrapper[4764]: I1001 16:37:09.618001 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" podStartSLOduration=2.963237157 podStartE2EDuration="3.617985315s" podCreationTimestamp="2025-10-01 16:37:06 +0000 UTC" firstStartedPulling="2025-10-01 16:37:07.839936282 +0000 UTC m=+2090.839583117" lastFinishedPulling="2025-10-01 16:37:08.4946844 +0000 UTC m=+2091.494331275" observedRunningTime="2025-10-01 16:37:09.614676483 +0000 UTC m=+2092.614323318" watchObservedRunningTime="2025-10-01 16:37:09.617985315 +0000 UTC m=+2092.617632150" Oct 01 16:37:20 crc kubenswrapper[4764]: I1001 16:37:20.692292 4764 generic.go:334] "Generic (PLEG): container finished" podID="839c68a1-2404-4037-8975-58e6b02ba81f" containerID="fe62fb8f0f67b7e6e9f2c7a5e91332cb28aeef7e3b113f68b33ec990a41da33c" exitCode=0 Oct 01 16:37:20 crc kubenswrapper[4764]: I1001 16:37:20.692371 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" event={"ID":"839c68a1-2404-4037-8975-58e6b02ba81f","Type":"ContainerDied","Data":"fe62fb8f0f67b7e6e9f2c7a5e91332cb28aeef7e3b113f68b33ec990a41da33c"} Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.124138 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.241891 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ssh-key\") pod \"839c68a1-2404-4037-8975-58e6b02ba81f\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.242007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ceph\") pod \"839c68a1-2404-4037-8975-58e6b02ba81f\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.242105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-inventory\") pod \"839c68a1-2404-4037-8975-58e6b02ba81f\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.242207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-repo-setup-combined-ca-bundle\") pod \"839c68a1-2404-4037-8975-58e6b02ba81f\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.242276 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2r7g\" 
(UniqueName: \"kubernetes.io/projected/839c68a1-2404-4037-8975-58e6b02ba81f-kube-api-access-g2r7g\") pod \"839c68a1-2404-4037-8975-58e6b02ba81f\" (UID: \"839c68a1-2404-4037-8975-58e6b02ba81f\") " Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.248657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ceph" (OuterVolumeSpecName: "ceph") pod "839c68a1-2404-4037-8975-58e6b02ba81f" (UID: "839c68a1-2404-4037-8975-58e6b02ba81f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.248799 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839c68a1-2404-4037-8975-58e6b02ba81f-kube-api-access-g2r7g" (OuterVolumeSpecName: "kube-api-access-g2r7g") pod "839c68a1-2404-4037-8975-58e6b02ba81f" (UID: "839c68a1-2404-4037-8975-58e6b02ba81f"). InnerVolumeSpecName "kube-api-access-g2r7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.252748 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "839c68a1-2404-4037-8975-58e6b02ba81f" (UID: "839c68a1-2404-4037-8975-58e6b02ba81f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.284996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-inventory" (OuterVolumeSpecName: "inventory") pod "839c68a1-2404-4037-8975-58e6b02ba81f" (UID: "839c68a1-2404-4037-8975-58e6b02ba81f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.290788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "839c68a1-2404-4037-8975-58e6b02ba81f" (UID: "839c68a1-2404-4037-8975-58e6b02ba81f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.345569 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.345625 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.345644 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.345663 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839c68a1-2404-4037-8975-58e6b02ba81f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.345684 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2r7g\" (UniqueName: \"kubernetes.io/projected/839c68a1-2404-4037-8975-58e6b02ba81f-kube-api-access-g2r7g\") on node \"crc\" DevicePath \"\"" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.712919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" 
event={"ID":"839c68a1-2404-4037-8975-58e6b02ba81f","Type":"ContainerDied","Data":"afe05fa2ceeee5c96c9c2e30d919cadf739fe6d22553eec2e0fa85b4003bcd49"} Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.712959 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe05fa2ceeee5c96c9c2e30d919cadf739fe6d22553eec2e0fa85b4003bcd49" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.712976 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.780778 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m"] Oct 01 16:37:22 crc kubenswrapper[4764]: E1001 16:37:22.781382 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839c68a1-2404-4037-8975-58e6b02ba81f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.781408 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="839c68a1-2404-4037-8975-58e6b02ba81f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.781654 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="839c68a1-2404-4037-8975-58e6b02ba81f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.782598 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.785512 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.785779 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.785961 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.787188 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.787464 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.794262 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m"] Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.854664 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.854748 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: 
\"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.854916 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.854989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.855170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6bkf\" (UniqueName: \"kubernetes.io/projected/abe7d369-08b8-431b-9b66-3b6056a37e00-kube-api-access-b6bkf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.957158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6bkf\" (UniqueName: \"kubernetes.io/projected/abe7d369-08b8-431b-9b66-3b6056a37e00-kube-api-access-b6bkf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.957279 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.957311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.957409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.957489 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.962348 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.962431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.963041 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.963424 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:22 crc kubenswrapper[4764]: I1001 16:37:22.974040 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6bkf\" (UniqueName: \"kubernetes.io/projected/abe7d369-08b8-431b-9b66-3b6056a37e00-kube-api-access-b6bkf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:23 crc kubenswrapper[4764]: I1001 16:37:23.098254 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:37:23 crc kubenswrapper[4764]: I1001 16:37:23.604558 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m"] Oct 01 16:37:23 crc kubenswrapper[4764]: I1001 16:37:23.732251 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" event={"ID":"abe7d369-08b8-431b-9b66-3b6056a37e00","Type":"ContainerStarted","Data":"d19b72e16cfbc21521b39820912a9188c1bbbef1568cc51f5cd4dbf38cb60d52"} Oct 01 16:37:25 crc kubenswrapper[4764]: I1001 16:37:25.755326 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" event={"ID":"abe7d369-08b8-431b-9b66-3b6056a37e00","Type":"ContainerStarted","Data":"941702f24bc170a7dc8eb6d6d71ac585c6cd3d934af4fab1a27e9d0f6362b9f3"} Oct 01 16:37:25 crc kubenswrapper[4764]: I1001 16:37:25.781149 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" podStartSLOduration=2.9892873460000002 podStartE2EDuration="3.781128081s" podCreationTimestamp="2025-10-01 16:37:22 +0000 UTC" firstStartedPulling="2025-10-01 16:37:23.611476733 +0000 UTC m=+2106.611123568" lastFinishedPulling="2025-10-01 16:37:24.403317438 +0000 UTC m=+2107.402964303" observedRunningTime="2025-10-01 16:37:25.777530164 +0000 UTC m=+2108.777177009" watchObservedRunningTime="2025-10-01 16:37:25.781128081 +0000 UTC m=+2108.780774926" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.022830 4764 scope.go:117] "RemoveContainer" containerID="272fa0ab9e95ed0ca13f416d177c42f9576a06ffd3a3c0e369f0b1f5b959bc1b" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.106737 4764 scope.go:117] "RemoveContainer" containerID="cb90743ae2f3ca72f211a1043c24b28326f767dde457945e57513182d60721d0" Oct 01 16:37:47 crc 
kubenswrapper[4764]: I1001 16:37:47.180753 4764 scope.go:117] "RemoveContainer" containerID="1cc8b29e1f2180ae404f33f65f3533100368a78feb0edd58fd47d8b4a4f029fb" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.224758 4764 scope.go:117] "RemoveContainer" containerID="c159541cbac17aabb8fbad292de301d773f006ebd5b80c34af1c738521b69770" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.253852 4764 scope.go:117] "RemoveContainer" containerID="8a5921afb3a3713d040a30cb566f6eb2d40f370557ec018a3b780d0ffde91301" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.306999 4764 scope.go:117] "RemoveContainer" containerID="d73bb60f2e6eda5d30dc3cf612c93648ffc730a42ce6f351193af2502b409bd5" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.353987 4764 scope.go:117] "RemoveContainer" containerID="ee0a137efebff1555547d5a9673ffb7a7bb9d7af3f4c76c80829a25c9fe24fc0" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.407193 4764 scope.go:117] "RemoveContainer" containerID="82a1cec33b73cf1a610c0c98aa69ac0cee97502e8599323923b3d94275087e8a" Oct 01 16:37:47 crc kubenswrapper[4764]: I1001 16:37:47.441556 4764 scope.go:117] "RemoveContainer" containerID="05522678f7b5b2f2596d8d9a274d086645f451b8d9a21ba60655e5f0ddcb051d" Oct 01 16:37:51 crc kubenswrapper[4764]: I1001 16:37:51.914711 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:37:51 crc kubenswrapper[4764]: I1001 16:37:51.918108 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:38:21 crc 
kubenswrapper[4764]: I1001 16:38:21.913543 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:38:21 crc kubenswrapper[4764]: I1001 16:38:21.914578 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:38:47 crc kubenswrapper[4764]: I1001 16:38:47.659531 4764 scope.go:117] "RemoveContainer" containerID="40c706c7ecd0f3dcd1a801fd0481d9e4ee2bdb09b983e3f68f19611b1d3bbb78" Oct 01 16:38:47 crc kubenswrapper[4764]: I1001 16:38:47.726135 4764 scope.go:117] "RemoveContainer" containerID="6df8c1aae9178e046c4a55faacd040477294c548d5feb7d84f245648218a3866" Oct 01 16:38:51 crc kubenswrapper[4764]: I1001 16:38:51.914236 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:38:51 crc kubenswrapper[4764]: I1001 16:38:51.914744 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:38:51 crc kubenswrapper[4764]: I1001 16:38:51.914819 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:38:51 crc kubenswrapper[4764]: I1001 16:38:51.916010 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0b778d1376a46a6d5f3e2b028c1b5a6ac86778ba307f59878807e5928f0d1b7"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:38:51 crc kubenswrapper[4764]: I1001 16:38:51.916170 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://d0b778d1376a46a6d5f3e2b028c1b5a6ac86778ba307f59878807e5928f0d1b7" gracePeriod=600 Oct 01 16:38:52 crc kubenswrapper[4764]: I1001 16:38:52.633817 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="d0b778d1376a46a6d5f3e2b028c1b5a6ac86778ba307f59878807e5928f0d1b7" exitCode=0 Oct 01 16:38:52 crc kubenswrapper[4764]: I1001 16:38:52.633886 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"d0b778d1376a46a6d5f3e2b028c1b5a6ac86778ba307f59878807e5928f0d1b7"} Oct 01 16:38:52 crc kubenswrapper[4764]: I1001 16:38:52.634303 4764 scope.go:117] "RemoveContainer" containerID="735730b6a11fcac0fc45a7ad5ec201d3ebe1dc3ce4a7cbcbf02cb02b1a512176" Oct 01 16:38:53 crc kubenswrapper[4764]: I1001 16:38:53.644437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" 
event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"} Oct 01 16:39:00 crc kubenswrapper[4764]: I1001 16:39:00.722091 4764 generic.go:334] "Generic (PLEG): container finished" podID="abe7d369-08b8-431b-9b66-3b6056a37e00" containerID="941702f24bc170a7dc8eb6d6d71ac585c6cd3d934af4fab1a27e9d0f6362b9f3" exitCode=0 Oct 01 16:39:00 crc kubenswrapper[4764]: I1001 16:39:00.722692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" event={"ID":"abe7d369-08b8-431b-9b66-3b6056a37e00","Type":"ContainerDied","Data":"941702f24bc170a7dc8eb6d6d71ac585c6cd3d934af4fab1a27e9d0f6362b9f3"} Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.234816 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.395385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ssh-key\") pod \"abe7d369-08b8-431b-9b66-3b6056a37e00\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.395432 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-bootstrap-combined-ca-bundle\") pod \"abe7d369-08b8-431b-9b66-3b6056a37e00\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.395631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ceph\") pod \"abe7d369-08b8-431b-9b66-3b6056a37e00\" (UID: 
\"abe7d369-08b8-431b-9b66-3b6056a37e00\") " Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.395671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6bkf\" (UniqueName: \"kubernetes.io/projected/abe7d369-08b8-431b-9b66-3b6056a37e00-kube-api-access-b6bkf\") pod \"abe7d369-08b8-431b-9b66-3b6056a37e00\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.395702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-inventory\") pod \"abe7d369-08b8-431b-9b66-3b6056a37e00\" (UID: \"abe7d369-08b8-431b-9b66-3b6056a37e00\") " Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.401258 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ceph" (OuterVolumeSpecName: "ceph") pod "abe7d369-08b8-431b-9b66-3b6056a37e00" (UID: "abe7d369-08b8-431b-9b66-3b6056a37e00"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.403845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe7d369-08b8-431b-9b66-3b6056a37e00-kube-api-access-b6bkf" (OuterVolumeSpecName: "kube-api-access-b6bkf") pod "abe7d369-08b8-431b-9b66-3b6056a37e00" (UID: "abe7d369-08b8-431b-9b66-3b6056a37e00"). InnerVolumeSpecName "kube-api-access-b6bkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.404365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "abe7d369-08b8-431b-9b66-3b6056a37e00" (UID: "abe7d369-08b8-431b-9b66-3b6056a37e00"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.421823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-inventory" (OuterVolumeSpecName: "inventory") pod "abe7d369-08b8-431b-9b66-3b6056a37e00" (UID: "abe7d369-08b8-431b-9b66-3b6056a37e00"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.454430 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "abe7d369-08b8-431b-9b66-3b6056a37e00" (UID: "abe7d369-08b8-431b-9b66-3b6056a37e00"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.498134 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6bkf\" (UniqueName: \"kubernetes.io/projected/abe7d369-08b8-431b-9b66-3b6056a37e00-kube-api-access-b6bkf\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.498500 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.498598 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.498787 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-bootstrap-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.498879 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/abe7d369-08b8-431b-9b66-3b6056a37e00-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.762622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" event={"ID":"abe7d369-08b8-431b-9b66-3b6056a37e00","Type":"ContainerDied","Data":"d19b72e16cfbc21521b39820912a9188c1bbbef1568cc51f5cd4dbf38cb60d52"} Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.763468 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19b72e16cfbc21521b39820912a9188c1bbbef1568cc51f5cd4dbf38cb60d52" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.762901 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.866288 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf"] Oct 01 16:39:02 crc kubenswrapper[4764]: E1001 16:39:02.867401 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abe7d369-08b8-431b-9b66-3b6056a37e00" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.867446 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe7d369-08b8-431b-9b66-3b6056a37e00" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.867745 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="abe7d369-08b8-431b-9b66-3b6056a37e00" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.868576 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.873598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.873598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.873655 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.874476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.874811 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:39:02 crc kubenswrapper[4764]: I1001 16:39:02.887272 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf"] Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.012744 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdgz\" (UniqueName: \"kubernetes.io/projected/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-kube-api-access-ggdgz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.012800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.012845 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.012944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.114920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.115019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdgz\" (UniqueName: \"kubernetes.io/projected/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-kube-api-access-ggdgz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 
crc kubenswrapper[4764]: I1001 16:39:03.115084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.115142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.120485 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.130198 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.130369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.135719 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggdgz\" (UniqueName: \"kubernetes.io/projected/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-kube-api-access-ggdgz\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.219026 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:03 crc kubenswrapper[4764]: I1001 16:39:03.836132 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf"] Oct 01 16:39:03 crc kubenswrapper[4764]: W1001 16:39:03.843813 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f55b6f9_370f_489f_9bfd_989fbc5cd8b9.slice/crio-d72e19cc5593298bafbe2ca5d328379ce9c831437464db5ad2ae85fb088c1244 WatchSource:0}: Error finding container d72e19cc5593298bafbe2ca5d328379ce9c831437464db5ad2ae85fb088c1244: Status 404 returned error can't find the container with id d72e19cc5593298bafbe2ca5d328379ce9c831437464db5ad2ae85fb088c1244 Oct 01 16:39:04 crc kubenswrapper[4764]: I1001 16:39:04.790944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" event={"ID":"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9","Type":"ContainerStarted","Data":"d72e19cc5593298bafbe2ca5d328379ce9c831437464db5ad2ae85fb088c1244"} Oct 01 16:39:05 crc kubenswrapper[4764]: I1001 
16:39:05.805019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" event={"ID":"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9","Type":"ContainerStarted","Data":"07068fbb88c4fad8a0941c6bee666878f4bf2cdeeda036d3a48b1308d259e97a"} Oct 01 16:39:05 crc kubenswrapper[4764]: I1001 16:39:05.835171 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" podStartSLOduration=3.089500322 podStartE2EDuration="3.835142382s" podCreationTimestamp="2025-10-01 16:39:02 +0000 UTC" firstStartedPulling="2025-10-01 16:39:03.847647097 +0000 UTC m=+2206.847293942" lastFinishedPulling="2025-10-01 16:39:04.593289167 +0000 UTC m=+2207.592936002" observedRunningTime="2025-10-01 16:39:05.826748246 +0000 UTC m=+2208.826395111" watchObservedRunningTime="2025-10-01 16:39:05.835142382 +0000 UTC m=+2208.834789247" Oct 01 16:39:32 crc kubenswrapper[4764]: I1001 16:39:32.080210 4764 generic.go:334] "Generic (PLEG): container finished" podID="4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" containerID="07068fbb88c4fad8a0941c6bee666878f4bf2cdeeda036d3a48b1308d259e97a" exitCode=0 Oct 01 16:39:32 crc kubenswrapper[4764]: I1001 16:39:32.080460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" event={"ID":"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9","Type":"ContainerDied","Data":"07068fbb88c4fad8a0941c6bee666878f4bf2cdeeda036d3a48b1308d259e97a"} Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.569608 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.758606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ssh-key\") pod \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.758834 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-inventory\") pod \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.758883 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggdgz\" (UniqueName: \"kubernetes.io/projected/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-kube-api-access-ggdgz\") pod \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.758963 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ceph\") pod \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\" (UID: \"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9\") " Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.765971 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ceph" (OuterVolumeSpecName: "ceph") pod "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" (UID: "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.768791 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-kube-api-access-ggdgz" (OuterVolumeSpecName: "kube-api-access-ggdgz") pod "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" (UID: "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9"). InnerVolumeSpecName "kube-api-access-ggdgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.796249 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" (UID: "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.806448 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-inventory" (OuterVolumeSpecName: "inventory") pod "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" (UID: "4f55b6f9-370f-489f-9bfd-989fbc5cd8b9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.860989 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.861646 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggdgz\" (UniqueName: \"kubernetes.io/projected/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-kube-api-access-ggdgz\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.861660 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:33 crc kubenswrapper[4764]: I1001 16:39:33.861670 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f55b6f9-370f-489f-9bfd-989fbc5cd8b9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.109277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" event={"ID":"4f55b6f9-370f-489f-9bfd-989fbc5cd8b9","Type":"ContainerDied","Data":"d72e19cc5593298bafbe2ca5d328379ce9c831437464db5ad2ae85fb088c1244"} Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.109329 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d72e19cc5593298bafbe2ca5d328379ce9c831437464db5ad2ae85fb088c1244" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.109385 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.207597 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt"] Oct 01 16:39:34 crc kubenswrapper[4764]: E1001 16:39:34.208090 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.208111 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.208336 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f55b6f9-370f-489f-9bfd-989fbc5cd8b9" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.209016 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.213470 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.214009 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.214289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.214443 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.214846 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.220621 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt"] Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.375181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbbcf\" (UniqueName: \"kubernetes.io/projected/0134afb9-9d23-47e6-9d46-6a025c3a3a57-kube-api-access-lbbcf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.375305 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: 
\"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.376196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.376445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.477898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.477987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.478088 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.478161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbbcf\" (UniqueName: \"kubernetes.io/projected/0134afb9-9d23-47e6-9d46-6a025c3a3a57-kube-api-access-lbbcf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.486182 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.486482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.490481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.502942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbbcf\" (UniqueName: \"kubernetes.io/projected/0134afb9-9d23-47e6-9d46-6a025c3a3a57-kube-api-access-lbbcf\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:34 crc kubenswrapper[4764]: I1001 16:39:34.536117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:35 crc kubenswrapper[4764]: I1001 16:39:35.117668 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt"] Oct 01 16:39:36 crc kubenswrapper[4764]: I1001 16:39:36.129356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" event={"ID":"0134afb9-9d23-47e6-9d46-6a025c3a3a57","Type":"ContainerStarted","Data":"dd8423dc6063dc619529a11d5349f1dce98862c57d4272228756d1c61b914821"} Oct 01 16:39:36 crc kubenswrapper[4764]: I1001 16:39:36.129753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" event={"ID":"0134afb9-9d23-47e6-9d46-6a025c3a3a57","Type":"ContainerStarted","Data":"18b8d181363aeb2d8f1f9244b4cacde02908634a16442a3be4d971df078db6f2"} Oct 01 16:39:36 crc kubenswrapper[4764]: I1001 16:39:36.150461 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" podStartSLOduration=1.548084515 podStartE2EDuration="2.150443596s" podCreationTimestamp="2025-10-01 16:39:34 +0000 UTC" firstStartedPulling="2025-10-01 
16:39:35.131436063 +0000 UTC m=+2238.131082898" lastFinishedPulling="2025-10-01 16:39:35.733795144 +0000 UTC m=+2238.733441979" observedRunningTime="2025-10-01 16:39:36.147992105 +0000 UTC m=+2239.147638930" watchObservedRunningTime="2025-10-01 16:39:36.150443596 +0000 UTC m=+2239.150090431" Oct 01 16:39:41 crc kubenswrapper[4764]: I1001 16:39:41.183863 4764 generic.go:334] "Generic (PLEG): container finished" podID="0134afb9-9d23-47e6-9d46-6a025c3a3a57" containerID="dd8423dc6063dc619529a11d5349f1dce98862c57d4272228756d1c61b914821" exitCode=0 Oct 01 16:39:41 crc kubenswrapper[4764]: I1001 16:39:41.183988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" event={"ID":"0134afb9-9d23-47e6-9d46-6a025c3a3a57","Type":"ContainerDied","Data":"dd8423dc6063dc619529a11d5349f1dce98862c57d4272228756d1c61b914821"} Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.627032 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.745826 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-inventory\") pod \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.746188 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ceph\") pod \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.746390 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ssh-key\") pod \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.746457 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbbcf\" (UniqueName: \"kubernetes.io/projected/0134afb9-9d23-47e6-9d46-6a025c3a3a57-kube-api-access-lbbcf\") pod \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\" (UID: \"0134afb9-9d23-47e6-9d46-6a025c3a3a57\") " Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.752236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0134afb9-9d23-47e6-9d46-6a025c3a3a57-kube-api-access-lbbcf" (OuterVolumeSpecName: "kube-api-access-lbbcf") pod "0134afb9-9d23-47e6-9d46-6a025c3a3a57" (UID: "0134afb9-9d23-47e6-9d46-6a025c3a3a57"). InnerVolumeSpecName "kube-api-access-lbbcf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.753878 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ceph" (OuterVolumeSpecName: "ceph") pod "0134afb9-9d23-47e6-9d46-6a025c3a3a57" (UID: "0134afb9-9d23-47e6-9d46-6a025c3a3a57"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.773030 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0134afb9-9d23-47e6-9d46-6a025c3a3a57" (UID: "0134afb9-9d23-47e6-9d46-6a025c3a3a57"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.773110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-inventory" (OuterVolumeSpecName: "inventory") pod "0134afb9-9d23-47e6-9d46-6a025c3a3a57" (UID: "0134afb9-9d23-47e6-9d46-6a025c3a3a57"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.854253 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.854288 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.854296 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0134afb9-9d23-47e6-9d46-6a025c3a3a57-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:42 crc kubenswrapper[4764]: I1001 16:39:42.854306 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbbcf\" (UniqueName: \"kubernetes.io/projected/0134afb9-9d23-47e6-9d46-6a025c3a3a57-kube-api-access-lbbcf\") on node \"crc\" DevicePath \"\"" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.206261 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" event={"ID":"0134afb9-9d23-47e6-9d46-6a025c3a3a57","Type":"ContainerDied","Data":"18b8d181363aeb2d8f1f9244b4cacde02908634a16442a3be4d971df078db6f2"} Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.206589 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18b8d181363aeb2d8f1f9244b4cacde02908634a16442a3be4d971df078db6f2" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.206383 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.310112 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr"] Oct 01 16:39:43 crc kubenswrapper[4764]: E1001 16:39:43.310741 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0134afb9-9d23-47e6-9d46-6a025c3a3a57" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.310767 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0134afb9-9d23-47e6-9d46-6a025c3a3a57" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.311028 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0134afb9-9d23-47e6-9d46-6a025c3a3a57" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.311772 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.316547 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.316687 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.316874 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.317071 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.317453 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.355245 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr"] Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.364617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.364783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5l49\" (UniqueName: \"kubernetes.io/projected/a0273bd3-26f6-44d9-a665-75c9eac2cf98-kube-api-access-d5l49\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") 
" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.365030 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.365094 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.466922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.467027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.467175 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.467283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5l49\" (UniqueName: \"kubernetes.io/projected/a0273bd3-26f6-44d9-a665-75c9eac2cf98-kube-api-access-d5l49\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.473750 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.473813 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.473982 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.490236 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5l49\" (UniqueName: \"kubernetes.io/projected/a0273bd3-26f6-44d9-a665-75c9eac2cf98-kube-api-access-d5l49\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wtrjr\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:43 crc kubenswrapper[4764]: I1001 16:39:43.632538 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:39:44 crc kubenswrapper[4764]: I1001 16:39:44.205628 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr"] Oct 01 16:39:44 crc kubenswrapper[4764]: W1001 16:39:44.209734 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0273bd3_26f6_44d9_a665_75c9eac2cf98.slice/crio-36664972efa0a4fc9995da2ab68fe9e83cc91a965a8831ebb5a4a9ddac05733a WatchSource:0}: Error finding container 36664972efa0a4fc9995da2ab68fe9e83cc91a965a8831ebb5a4a9ddac05733a: Status 404 returned error can't find the container with id 36664972efa0a4fc9995da2ab68fe9e83cc91a965a8831ebb5a4a9ddac05733a Oct 01 16:39:45 crc kubenswrapper[4764]: I1001 16:39:45.224411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" event={"ID":"a0273bd3-26f6-44d9-a665-75c9eac2cf98","Type":"ContainerStarted","Data":"5e4e968d51e37c933ff7409d2e90a04939897952c595663e1d5beaf5a007d6ff"} Oct 01 16:39:45 crc kubenswrapper[4764]: I1001 16:39:45.224793 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" event={"ID":"a0273bd3-26f6-44d9-a665-75c9eac2cf98","Type":"ContainerStarted","Data":"36664972efa0a4fc9995da2ab68fe9e83cc91a965a8831ebb5a4a9ddac05733a"} Oct 
01 16:39:45 crc kubenswrapper[4764]: I1001 16:39:45.243861 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" podStartSLOduration=1.657746594 podStartE2EDuration="2.243835017s" podCreationTimestamp="2025-10-01 16:39:43 +0000 UTC" firstStartedPulling="2025-10-01 16:39:44.215435582 +0000 UTC m=+2247.215082427" lastFinishedPulling="2025-10-01 16:39:44.801524015 +0000 UTC m=+2247.801170850" observedRunningTime="2025-10-01 16:39:45.240390612 +0000 UTC m=+2248.240037457" watchObservedRunningTime="2025-10-01 16:39:45.243835017 +0000 UTC m=+2248.243481872" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.391530 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zmsl5"] Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.393689 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.407026 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmsl5"] Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.540057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-catalog-content\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.540420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-utilities\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " 
pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.540447 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zzw\" (UniqueName: \"kubernetes.io/projected/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-kube-api-access-44zzw\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.642397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-catalog-content\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.642853 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-utilities\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.643017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zzw\" (UniqueName: \"kubernetes.io/projected/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-kube-api-access-44zzw\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.643469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-catalog-content\") pod \"certified-operators-zmsl5\" (UID: 
\"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.643715 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-utilities\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.675994 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zzw\" (UniqueName: \"kubernetes.io/projected/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-kube-api-access-44zzw\") pod \"certified-operators-zmsl5\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:10 crc kubenswrapper[4764]: I1001 16:40:10.730235 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:11 crc kubenswrapper[4764]: I1001 16:40:11.271368 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zmsl5"] Oct 01 16:40:11 crc kubenswrapper[4764]: I1001 16:40:11.466212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerStarted","Data":"8006268e28738be80fd0ac486932e443c9b22f614b6d59eba408ac7f315c370c"} Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.191323 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kht"] Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.193011 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.215349 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kht"] Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.274237 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcj4\" (UniqueName: \"kubernetes.io/projected/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-kube-api-access-gqcj4\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.274548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-utilities\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.274613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-catalog-content\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.376012 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-utilities\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.376093 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-catalog-content\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.376138 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcj4\" (UniqueName: \"kubernetes.io/projected/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-kube-api-access-gqcj4\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.376515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-utilities\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.376686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-catalog-content\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.397098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcj4\" (UniqueName: \"kubernetes.io/projected/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-kube-api-access-gqcj4\") pod \"redhat-marketplace-z6kht\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.476458 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerID="9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6" exitCode=0 Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.476543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerDied","Data":"9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6"} Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.570413 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.792202 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zb9mv"] Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.795547 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.807032 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zb9mv"] Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.985389 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-utilities\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.985444 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-catalog-content\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " 
pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.986114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbm4b\" (UniqueName: \"kubernetes.io/projected/4445f728-62ad-4c70-9fc8-95d674e4680e-kube-api-access-lbm4b\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:12 crc kubenswrapper[4764]: I1001 16:40:12.994791 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kht"] Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.087813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-utilities\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.088168 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-catalog-content\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.088215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbm4b\" (UniqueName: \"kubernetes.io/projected/4445f728-62ad-4c70-9fc8-95d674e4680e-kube-api-access-lbm4b\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.088390 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-utilities\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.088669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-catalog-content\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.113519 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbm4b\" (UniqueName: \"kubernetes.io/projected/4445f728-62ad-4c70-9fc8-95d674e4680e-kube-api-access-lbm4b\") pod \"community-operators-zb9mv\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.122980 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.484993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerStarted","Data":"9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611"} Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.486557 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerID="3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6" exitCode=0 Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.486983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kht" event={"ID":"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16","Type":"ContainerDied","Data":"3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6"} Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.487005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kht" event={"ID":"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16","Type":"ContainerStarted","Data":"8984af6cb2f45bb8d59aca70b0cee640f8b246387cd51ca1bd93f905eb595951"} Oct 01 16:40:13 crc kubenswrapper[4764]: I1001 16:40:13.619410 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zb9mv"] Oct 01 16:40:13 crc kubenswrapper[4764]: W1001 16:40:13.624531 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4445f728_62ad_4c70_9fc8_95d674e4680e.slice/crio-e5e61eac0628a8fd376521eb6a07e09282d37979059f3962633028778f74d8a8 WatchSource:0}: Error finding container e5e61eac0628a8fd376521eb6a07e09282d37979059f3962633028778f74d8a8: Status 404 returned error can't find the container with id 
e5e61eac0628a8fd376521eb6a07e09282d37979059f3962633028778f74d8a8 Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.496772 4764 generic.go:334] "Generic (PLEG): container finished" podID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerID="9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611" exitCode=0 Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.496830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerDied","Data":"9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611"} Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.499100 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerID="23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28" exitCode=0 Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.499209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kht" event={"ID":"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16","Type":"ContainerDied","Data":"23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28"} Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.503820 4764 generic.go:334] "Generic (PLEG): container finished" podID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerID="ab12295f58fb4c424a36ff9990b49f282947a0428463238922beecc8793e9bb9" exitCode=0 Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.503851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb9mv" event={"ID":"4445f728-62ad-4c70-9fc8-95d674e4680e","Type":"ContainerDied","Data":"ab12295f58fb4c424a36ff9990b49f282947a0428463238922beecc8793e9bb9"} Oct 01 16:40:14 crc kubenswrapper[4764]: I1001 16:40:14.503877 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb9mv" 
event={"ID":"4445f728-62ad-4c70-9fc8-95d674e4680e","Type":"ContainerStarted","Data":"e5e61eac0628a8fd376521eb6a07e09282d37979059f3962633028778f74d8a8"} Oct 01 16:40:15 crc kubenswrapper[4764]: I1001 16:40:15.514779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerStarted","Data":"728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c"} Oct 01 16:40:15 crc kubenswrapper[4764]: I1001 16:40:15.522970 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kht" event={"ID":"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16","Type":"ContainerStarted","Data":"e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3"} Oct 01 16:40:15 crc kubenswrapper[4764]: I1001 16:40:15.550293 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zmsl5" podStartSLOduration=3.102401061 podStartE2EDuration="5.55027427s" podCreationTimestamp="2025-10-01 16:40:10 +0000 UTC" firstStartedPulling="2025-10-01 16:40:12.478876765 +0000 UTC m=+2275.478523620" lastFinishedPulling="2025-10-01 16:40:14.926749964 +0000 UTC m=+2277.926396829" observedRunningTime="2025-10-01 16:40:15.546593949 +0000 UTC m=+2278.546240804" watchObservedRunningTime="2025-10-01 16:40:15.55027427 +0000 UTC m=+2278.549921105" Oct 01 16:40:16 crc kubenswrapper[4764]: I1001 16:40:16.534241 4764 generic.go:334] "Generic (PLEG): container finished" podID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerID="a7530ce3074a264e2d399b51b68e3f014c534d6785dfb0371ef75f38a94f43ba" exitCode=0 Oct 01 16:40:16 crc kubenswrapper[4764]: I1001 16:40:16.534342 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb9mv" 
event={"ID":"4445f728-62ad-4c70-9fc8-95d674e4680e","Type":"ContainerDied","Data":"a7530ce3074a264e2d399b51b68e3f014c534d6785dfb0371ef75f38a94f43ba"} Oct 01 16:40:16 crc kubenswrapper[4764]: I1001 16:40:16.575439 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z6kht" podStartSLOduration=3.171570081 podStartE2EDuration="4.575417147s" podCreationTimestamp="2025-10-01 16:40:12 +0000 UTC" firstStartedPulling="2025-10-01 16:40:13.488645535 +0000 UTC m=+2276.488292370" lastFinishedPulling="2025-10-01 16:40:14.892492571 +0000 UTC m=+2277.892139436" observedRunningTime="2025-10-01 16:40:15.573949582 +0000 UTC m=+2278.573596417" watchObservedRunningTime="2025-10-01 16:40:16.575417147 +0000 UTC m=+2279.575063982" Oct 01 16:40:17 crc kubenswrapper[4764]: I1001 16:40:17.552502 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb9mv" event={"ID":"4445f728-62ad-4c70-9fc8-95d674e4680e","Type":"ContainerStarted","Data":"57ac2a7eadcfb6866b51e562622e0398b8c650424c7d5bf6dddb632ba939c330"} Oct 01 16:40:17 crc kubenswrapper[4764]: I1001 16:40:17.586003 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zb9mv" podStartSLOduration=2.816867164 podStartE2EDuration="5.585972997s" podCreationTimestamp="2025-10-01 16:40:12 +0000 UTC" firstStartedPulling="2025-10-01 16:40:14.505484149 +0000 UTC m=+2277.505130984" lastFinishedPulling="2025-10-01 16:40:17.274589932 +0000 UTC m=+2280.274236817" observedRunningTime="2025-10-01 16:40:17.57594844 +0000 UTC m=+2280.575595315" watchObservedRunningTime="2025-10-01 16:40:17.585972997 +0000 UTC m=+2280.585619832" Oct 01 16:40:20 crc kubenswrapper[4764]: I1001 16:40:20.730949 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:20 crc kubenswrapper[4764]: I1001 16:40:20.731847 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:20 crc kubenswrapper[4764]: I1001 16:40:20.790652 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:21 crc kubenswrapper[4764]: I1001 16:40:21.672723 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:22 crc kubenswrapper[4764]: I1001 16:40:22.185121 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmsl5"] Oct 01 16:40:22 crc kubenswrapper[4764]: I1001 16:40:22.573368 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:22 crc kubenswrapper[4764]: I1001 16:40:22.573607 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:22 crc kubenswrapper[4764]: I1001 16:40:22.607232 4764 generic.go:334] "Generic (PLEG): container finished" podID="a0273bd3-26f6-44d9-a665-75c9eac2cf98" containerID="5e4e968d51e37c933ff7409d2e90a04939897952c595663e1d5beaf5a007d6ff" exitCode=0 Oct 01 16:40:22 crc kubenswrapper[4764]: I1001 16:40:22.607330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" event={"ID":"a0273bd3-26f6-44d9-a665-75c9eac2cf98","Type":"ContainerDied","Data":"5e4e968d51e37c933ff7409d2e90a04939897952c595663e1d5beaf5a007d6ff"} Oct 01 16:40:22 crc kubenswrapper[4764]: I1001 16:40:22.642735 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:23 crc kubenswrapper[4764]: I1001 16:40:23.123276 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:23 crc kubenswrapper[4764]: I1001 16:40:23.123703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:23 crc kubenswrapper[4764]: I1001 16:40:23.198765 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:23 crc kubenswrapper[4764]: I1001 16:40:23.620531 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zmsl5" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="registry-server" containerID="cri-o://728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c" gracePeriod=2 Oct 01 16:40:23 crc kubenswrapper[4764]: I1001 16:40:23.711386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:23 crc kubenswrapper[4764]: I1001 16:40:23.719567 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.125676 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.130879 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229275 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ceph\") pod \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229377 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-utilities\") pod \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44zzw\" (UniqueName: \"kubernetes.io/projected/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-kube-api-access-44zzw\") pod \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229595 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5l49\" (UniqueName: \"kubernetes.io/projected/a0273bd3-26f6-44d9-a665-75c9eac2cf98-kube-api-access-d5l49\") pod \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229658 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-catalog-content\") pod \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\" (UID: \"60716b7b-5228-4e39-a1a3-cb7ced4eba4a\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-inventory\") pod \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.229932 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ssh-key\") pod \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\" (UID: \"a0273bd3-26f6-44d9-a665-75c9eac2cf98\") " Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.233262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-utilities" (OuterVolumeSpecName: "utilities") pod "60716b7b-5228-4e39-a1a3-cb7ced4eba4a" (UID: "60716b7b-5228-4e39-a1a3-cb7ced4eba4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.236963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0273bd3-26f6-44d9-a665-75c9eac2cf98-kube-api-access-d5l49" (OuterVolumeSpecName: "kube-api-access-d5l49") pod "a0273bd3-26f6-44d9-a665-75c9eac2cf98" (UID: "a0273bd3-26f6-44d9-a665-75c9eac2cf98"). InnerVolumeSpecName "kube-api-access-d5l49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.238348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-kube-api-access-44zzw" (OuterVolumeSpecName: "kube-api-access-44zzw") pod "60716b7b-5228-4e39-a1a3-cb7ced4eba4a" (UID: "60716b7b-5228-4e39-a1a3-cb7ced4eba4a"). InnerVolumeSpecName "kube-api-access-44zzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.253187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ceph" (OuterVolumeSpecName: "ceph") pod "a0273bd3-26f6-44d9-a665-75c9eac2cf98" (UID: "a0273bd3-26f6-44d9-a665-75c9eac2cf98"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.261200 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-inventory" (OuterVolumeSpecName: "inventory") pod "a0273bd3-26f6-44d9-a665-75c9eac2cf98" (UID: "a0273bd3-26f6-44d9-a665-75c9eac2cf98"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.263110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0273bd3-26f6-44d9-a665-75c9eac2cf98" (UID: "a0273bd3-26f6-44d9-a665-75c9eac2cf98"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.270682 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60716b7b-5228-4e39-a1a3-cb7ced4eba4a" (UID: "60716b7b-5228-4e39-a1a3-cb7ced4eba4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.331961 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.331996 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.332008 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0273bd3-26f6-44d9-a665-75c9eac2cf98-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.332019 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.332032 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44zzw\" (UniqueName: \"kubernetes.io/projected/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-kube-api-access-44zzw\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.332059 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5l49\" (UniqueName: \"kubernetes.io/projected/a0273bd3-26f6-44d9-a665-75c9eac2cf98-kube-api-access-d5l49\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.332070 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60716b7b-5228-4e39-a1a3-cb7ced4eba4a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.633526 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerID="728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c" exitCode=0 Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.633607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerDied","Data":"728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c"} Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.633638 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zmsl5" event={"ID":"60716b7b-5228-4e39-a1a3-cb7ced4eba4a","Type":"ContainerDied","Data":"8006268e28738be80fd0ac486932e443c9b22f614b6d59eba408ac7f315c370c"} Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.633660 4764 scope.go:117] "RemoveContainer" containerID="728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.634025 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zmsl5" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.636797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" event={"ID":"a0273bd3-26f6-44d9-a665-75c9eac2cf98","Type":"ContainerDied","Data":"36664972efa0a4fc9995da2ab68fe9e83cc91a965a8831ebb5a4a9ddac05733a"} Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.636886 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36664972efa0a4fc9995da2ab68fe9e83cc91a965a8831ebb5a4a9ddac05733a" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.636997 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wtrjr" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.670759 4764 scope.go:117] "RemoveContainer" containerID="9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.676964 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zmsl5"] Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.687099 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zmsl5"] Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.695315 4764 scope.go:117] "RemoveContainer" containerID="9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.714538 4764 scope.go:117] "RemoveContainer" containerID="728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c" Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.715513 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c\": container with ID starting with 728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c not found: ID does not exist" containerID="728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.715751 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c"} err="failed to get container status \"728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c\": rpc error: code = NotFound desc = could not find container \"728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c\": container with ID starting with 
728f081cd69795a900c9282328271799f62a6b27c4b9766345dbca05ffbc3b4c not found: ID does not exist" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.715934 4764 scope.go:117] "RemoveContainer" containerID="9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611" Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.716657 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611\": container with ID starting with 9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611 not found: ID does not exist" containerID="9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.716698 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611"} err="failed to get container status \"9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611\": rpc error: code = NotFound desc = could not find container \"9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611\": container with ID starting with 9cfd1fb45dbad8cf32af9db28aab663aac2d38eabdebca3d4adb0d2123bce611 not found: ID does not exist" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.716725 4764 scope.go:117] "RemoveContainer" containerID="9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6" Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.717188 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6\": container with ID starting with 9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6 not found: ID does not exist" containerID="9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6" Oct 01 16:40:24 crc 
kubenswrapper[4764]: I1001 16:40:24.717219 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6"} err="failed to get container status \"9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6\": rpc error: code = NotFound desc = could not find container \"9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6\": container with ID starting with 9b0660bbe93684320cefc7196ebf186800e47f235b097b3aaf154a9a8eedabb6 not found: ID does not exist" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.739728 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl"] Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.740105 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0273bd3-26f6-44d9-a665-75c9eac2cf98" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740119 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0273bd3-26f6-44d9-a665-75c9eac2cf98" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.740137 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="registry-server" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740143 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="registry-server" Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.740156 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="extract-content" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740162 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" 
containerName="extract-content" Oct 01 16:40:24 crc kubenswrapper[4764]: E1001 16:40:24.740175 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="extract-utilities" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740181 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="extract-utilities" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740341 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0273bd3-26f6-44d9-a665-75c9eac2cf98" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740353 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" containerName="registry-server" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.740916 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.746438 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.746475 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.746519 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.746491 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.746656 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:40:24 crc kubenswrapper[4764]: 
I1001 16:40:24.754260 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl"] Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.841559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.841600 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-725dj\" (UniqueName: \"kubernetes.io/projected/c353fd70-5d43-4e79-9863-9d1c4156df15-kube-api-access-725dj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.841648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.841696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 
16:40:24.942356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.942403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-725dj\" (UniqueName: \"kubernetes.io/projected/c353fd70-5d43-4e79-9863-9d1c4156df15-kube-api-access-725dj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.942454 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.942484 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.947600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: 
\"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.947904 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.954794 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.958862 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-725dj\" (UniqueName: \"kubernetes.io/projected/c353fd70-5d43-4e79-9863-9d1c4156df15-kube-api-access-725dj\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:24 crc kubenswrapper[4764]: I1001 16:40:24.988659 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kht"] Oct 01 16:40:25 crc kubenswrapper[4764]: I1001 16:40:25.110484 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:25 crc kubenswrapper[4764]: I1001 16:40:25.635550 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl"] Oct 01 16:40:25 crc kubenswrapper[4764]: I1001 16:40:25.650818 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z6kht" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="registry-server" containerID="cri-o://e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3" gracePeriod=2 Oct 01 16:40:25 crc kubenswrapper[4764]: I1001 16:40:25.733275 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60716b7b-5228-4e39-a1a3-cb7ced4eba4a" path="/var/lib/kubelet/pods/60716b7b-5228-4e39-a1a3-cb7ced4eba4a/volumes" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.163248 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.260992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcj4\" (UniqueName: \"kubernetes.io/projected/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-kube-api-access-gqcj4\") pod \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.261289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-catalog-content\") pod \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.261379 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-utilities\") pod \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\" (UID: \"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16\") " Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.263057 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-utilities" (OuterVolumeSpecName: "utilities") pod "0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" (UID: "0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.269879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-kube-api-access-gqcj4" (OuterVolumeSpecName: "kube-api-access-gqcj4") pod "0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" (UID: "0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16"). InnerVolumeSpecName "kube-api-access-gqcj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.275390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" (UID: "0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.363770 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.363809 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.363821 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqcj4\" (UniqueName: \"kubernetes.io/projected/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16-kube-api-access-gqcj4\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.383356 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zb9mv"] Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.383588 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zb9mv" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="registry-server" containerID="cri-o://57ac2a7eadcfb6866b51e562622e0398b8c650424c7d5bf6dddb632ba939c330" gracePeriod=2 Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.664549 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerID="57ac2a7eadcfb6866b51e562622e0398b8c650424c7d5bf6dddb632ba939c330" exitCode=0 Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.664623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb9mv" event={"ID":"4445f728-62ad-4c70-9fc8-95d674e4680e","Type":"ContainerDied","Data":"57ac2a7eadcfb6866b51e562622e0398b8c650424c7d5bf6dddb632ba939c330"} Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.666897 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" event={"ID":"c353fd70-5d43-4e79-9863-9d1c4156df15","Type":"ContainerStarted","Data":"1137bc821845fb59a34fff77df8ffcedcf9608c7397246af00db6d4a19bd59b5"} Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.666923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" event={"ID":"c353fd70-5d43-4e79-9863-9d1c4156df15","Type":"ContainerStarted","Data":"ce76ce8ad8ed7da2bfb20fdfbab2fb9f3bf3b967d4d5ae281c508b4670a4bd66"} Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.670930 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerID="e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3" exitCode=0 Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.670988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kht" event={"ID":"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16","Type":"ContainerDied","Data":"e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3"} Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.671231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kht" 
event={"ID":"0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16","Type":"ContainerDied","Data":"8984af6cb2f45bb8d59aca70b0cee640f8b246387cd51ca1bd93f905eb595951"} Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.671259 4764 scope.go:117] "RemoveContainer" containerID="e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.671415 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kht" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.704842 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" podStartSLOduration=2.247647929 podStartE2EDuration="2.704817815s" podCreationTimestamp="2025-10-01 16:40:24 +0000 UTC" firstStartedPulling="2025-10-01 16:40:25.644756899 +0000 UTC m=+2288.644403785" lastFinishedPulling="2025-10-01 16:40:26.101926816 +0000 UTC m=+2289.101573671" observedRunningTime="2025-10-01 16:40:26.686194217 +0000 UTC m=+2289.685841052" watchObservedRunningTime="2025-10-01 16:40:26.704817815 +0000 UTC m=+2289.704464650" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.716852 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kht"] Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.720029 4764 scope.go:117] "RemoveContainer" containerID="23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.724355 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kht"] Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.739823 4764 scope.go:117] "RemoveContainer" containerID="3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.759198 4764 scope.go:117] "RemoveContainer" 
containerID="e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3" Oct 01 16:40:26 crc kubenswrapper[4764]: E1001 16:40:26.759698 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3\": container with ID starting with e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3 not found: ID does not exist" containerID="e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.759752 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3"} err="failed to get container status \"e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3\": rpc error: code = NotFound desc = could not find container \"e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3\": container with ID starting with e51396c5c09fa9d38d44571a18a7a58fe9f33041a588fb339f958d92c89f73e3 not found: ID does not exist" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.759786 4764 scope.go:117] "RemoveContainer" containerID="23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28" Oct 01 16:40:26 crc kubenswrapper[4764]: E1001 16:40:26.760352 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28\": container with ID starting with 23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28 not found: ID does not exist" containerID="23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.760457 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28"} err="failed to get container status \"23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28\": rpc error: code = NotFound desc = could not find container \"23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28\": container with ID starting with 23f8faf350891768a0d35139bd41849674cf9be8268a987c987376d7947e2f28 not found: ID does not exist" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.760520 4764 scope.go:117] "RemoveContainer" containerID="3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6" Oct 01 16:40:26 crc kubenswrapper[4764]: E1001 16:40:26.760963 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6\": container with ID starting with 3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6 not found: ID does not exist" containerID="3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.761007 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6"} err="failed to get container status \"3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6\": rpc error: code = NotFound desc = could not find container \"3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6\": container with ID starting with 3b48a2eaaf2210f3dfe8274640972c7e6dd89e9bd05a0a848389851786ba70b6 not found: ID does not exist" Oct 01 16:40:26 crc kubenswrapper[4764]: I1001 16:40:26.875187 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.079610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-utilities\") pod \"4445f728-62ad-4c70-9fc8-95d674e4680e\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.080193 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbm4b\" (UniqueName: \"kubernetes.io/projected/4445f728-62ad-4c70-9fc8-95d674e4680e-kube-api-access-lbm4b\") pod \"4445f728-62ad-4c70-9fc8-95d674e4680e\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.080292 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-catalog-content\") pod \"4445f728-62ad-4c70-9fc8-95d674e4680e\" (UID: \"4445f728-62ad-4c70-9fc8-95d674e4680e\") " Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.081333 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-utilities" (OuterVolumeSpecName: "utilities") pod "4445f728-62ad-4c70-9fc8-95d674e4680e" (UID: "4445f728-62ad-4c70-9fc8-95d674e4680e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.084695 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.085717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4445f728-62ad-4c70-9fc8-95d674e4680e-kube-api-access-lbm4b" (OuterVolumeSpecName: "kube-api-access-lbm4b") pod "4445f728-62ad-4c70-9fc8-95d674e4680e" (UID: "4445f728-62ad-4c70-9fc8-95d674e4680e"). InnerVolumeSpecName "kube-api-access-lbm4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.127525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4445f728-62ad-4c70-9fc8-95d674e4680e" (UID: "4445f728-62ad-4c70-9fc8-95d674e4680e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.185826 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbm4b\" (UniqueName: \"kubernetes.io/projected/4445f728-62ad-4c70-9fc8-95d674e4680e-kube-api-access-lbm4b\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.185863 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4445f728-62ad-4c70-9fc8-95d674e4680e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.688777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb9mv" event={"ID":"4445f728-62ad-4c70-9fc8-95d674e4680e","Type":"ContainerDied","Data":"e5e61eac0628a8fd376521eb6a07e09282d37979059f3962633028778f74d8a8"} Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.688862 4764 scope.go:117] "RemoveContainer" containerID="57ac2a7eadcfb6866b51e562622e0398b8c650424c7d5bf6dddb632ba939c330" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.688791 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zb9mv" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.737641 4764 scope.go:117] "RemoveContainer" containerID="a7530ce3074a264e2d399b51b68e3f014c534d6785dfb0371ef75f38a94f43ba" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.745774 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" path="/var/lib/kubelet/pods/0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16/volumes" Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.748275 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zb9mv"] Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.760127 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zb9mv"] Oct 01 16:40:27 crc kubenswrapper[4764]: I1001 16:40:27.774861 4764 scope.go:117] "RemoveContainer" containerID="ab12295f58fb4c424a36ff9990b49f282947a0428463238922beecc8793e9bb9" Oct 01 16:40:29 crc kubenswrapper[4764]: I1001 16:40:29.741599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" path="/var/lib/kubelet/pods/4445f728-62ad-4c70-9fc8-95d674e4680e/volumes" Oct 01 16:40:30 crc kubenswrapper[4764]: I1001 16:40:30.734612 4764 generic.go:334] "Generic (PLEG): container finished" podID="c353fd70-5d43-4e79-9863-9d1c4156df15" containerID="1137bc821845fb59a34fff77df8ffcedcf9608c7397246af00db6d4a19bd59b5" exitCode=0 Oct 01 16:40:30 crc kubenswrapper[4764]: I1001 16:40:30.734951 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" event={"ID":"c353fd70-5d43-4e79-9863-9d1c4156df15","Type":"ContainerDied","Data":"1137bc821845fb59a34fff77df8ffcedcf9608c7397246af00db6d4a19bd59b5"} Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.184928 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.192977 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-inventory\") pod \"c353fd70-5d43-4e79-9863-9d1c4156df15\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.193084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ceph\") pod \"c353fd70-5d43-4e79-9863-9d1c4156df15\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.193205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-725dj\" (UniqueName: \"kubernetes.io/projected/c353fd70-5d43-4e79-9863-9d1c4156df15-kube-api-access-725dj\") pod \"c353fd70-5d43-4e79-9863-9d1c4156df15\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.193408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ssh-key\") pod \"c353fd70-5d43-4e79-9863-9d1c4156df15\" (UID: \"c353fd70-5d43-4e79-9863-9d1c4156df15\") " Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.203967 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c353fd70-5d43-4e79-9863-9d1c4156df15-kube-api-access-725dj" (OuterVolumeSpecName: "kube-api-access-725dj") pod "c353fd70-5d43-4e79-9863-9d1c4156df15" (UID: "c353fd70-5d43-4e79-9863-9d1c4156df15"). InnerVolumeSpecName "kube-api-access-725dj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.206079 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ceph" (OuterVolumeSpecName: "ceph") pod "c353fd70-5d43-4e79-9863-9d1c4156df15" (UID: "c353fd70-5d43-4e79-9863-9d1c4156df15"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.234443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-inventory" (OuterVolumeSpecName: "inventory") pod "c353fd70-5d43-4e79-9863-9d1c4156df15" (UID: "c353fd70-5d43-4e79-9863-9d1c4156df15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.245187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c353fd70-5d43-4e79-9863-9d1c4156df15" (UID: "c353fd70-5d43-4e79-9863-9d1c4156df15"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.296130 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.296163 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.296176 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c353fd70-5d43-4e79-9863-9d1c4156df15-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.296189 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-725dj\" (UniqueName: \"kubernetes.io/projected/c353fd70-5d43-4e79-9863-9d1c4156df15-kube-api-access-725dj\") on node \"crc\" DevicePath \"\"" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.756829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" event={"ID":"c353fd70-5d43-4e79-9863-9d1c4156df15","Type":"ContainerDied","Data":"ce76ce8ad8ed7da2bfb20fdfbab2fb9f3bf3b967d4d5ae281c508b4670a4bd66"} Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.756875 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce76ce8ad8ed7da2bfb20fdfbab2fb9f3bf3b967d4d5ae281c508b4670a4bd66" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.756959 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.877596 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8"] Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 16:40:32.878884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="extract-utilities" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.879065 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="extract-utilities" Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 16:40:32.879232 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c353fd70-5d43-4e79-9863-9d1c4156df15" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.879352 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c353fd70-5d43-4e79-9863-9d1c4156df15" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 16:40:32.879439 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="extract-content" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.879519 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="extract-content" Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 16:40:32.879595 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="extract-utilities" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.879675 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="extract-utilities" Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 
16:40:32.879777 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="registry-server" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.879871 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="registry-server" Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 16:40:32.879959 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="extract-content" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.880041 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="extract-content" Oct 01 16:40:32 crc kubenswrapper[4764]: E1001 16:40:32.880174 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="registry-server" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.880246 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="registry-server" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.880577 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8e236a-9cc3-4dd1-9b1c-0bd3f21cab16" containerName="registry-server" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.880676 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4445f728-62ad-4c70-9fc8-95d674e4680e" containerName="registry-server" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.880757 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c353fd70-5d43-4e79-9863-9d1c4156df15" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.881984 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.885483 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.885914 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.886218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.886073 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.886464 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.889840 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8"] Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.929140 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.929504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz94d\" (UniqueName: \"kubernetes.io/projected/e02e8e56-086f-4152-accb-b8ffdb55a215-kube-api-access-hz94d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: 
\"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.929626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:32 crc kubenswrapper[4764]: I1001 16:40:32.929702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.032842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz94d\" (UniqueName: \"kubernetes.io/projected/e02e8e56-086f-4152-accb-b8ffdb55a215-kube-api-access-hz94d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.033156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.033218 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.033836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.038154 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.039541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.040394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc 
kubenswrapper[4764]: I1001 16:40:33.061651 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz94d\" (UniqueName: \"kubernetes.io/projected/e02e8e56-086f-4152-accb-b8ffdb55a215-kube-api-access-hz94d\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-57zm8\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.197964 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:40:33 crc kubenswrapper[4764]: I1001 16:40:33.811248 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8"] Oct 01 16:40:34 crc kubenswrapper[4764]: I1001 16:40:34.778539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" event={"ID":"e02e8e56-086f-4152-accb-b8ffdb55a215","Type":"ContainerStarted","Data":"9753377ccc6ea65734c31273e3cba45c8563737f58457728810f6a1dc727dd27"} Oct 01 16:40:34 crc kubenswrapper[4764]: I1001 16:40:34.779154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" event={"ID":"e02e8e56-086f-4152-accb-b8ffdb55a215","Type":"ContainerStarted","Data":"618a2db6569d3d66dc771a1aaf38dfff81d19bea95842f3de81897e39544cca9"} Oct 01 16:40:34 crc kubenswrapper[4764]: I1001 16:40:34.804753 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" podStartSLOduration=2.252848495 podStartE2EDuration="2.8047295s" podCreationTimestamp="2025-10-01 16:40:32 +0000 UTC" firstStartedPulling="2025-10-01 16:40:33.828117934 +0000 UTC m=+2296.827764779" lastFinishedPulling="2025-10-01 16:40:34.379998949 +0000 UTC m=+2297.379645784" 
observedRunningTime="2025-10-01 16:40:34.802631728 +0000 UTC m=+2297.802294623" watchObservedRunningTime="2025-10-01 16:40:34.8047295 +0000 UTC m=+2297.804376345" Oct 01 16:41:19 crc kubenswrapper[4764]: I1001 16:41:19.226791 4764 generic.go:334] "Generic (PLEG): container finished" podID="e02e8e56-086f-4152-accb-b8ffdb55a215" containerID="9753377ccc6ea65734c31273e3cba45c8563737f58457728810f6a1dc727dd27" exitCode=0 Oct 01 16:41:19 crc kubenswrapper[4764]: I1001 16:41:19.226885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" event={"ID":"e02e8e56-086f-4152-accb-b8ffdb55a215","Type":"ContainerDied","Data":"9753377ccc6ea65734c31273e3cba45c8563737f58457728810f6a1dc727dd27"} Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.617260 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.764984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-inventory\") pod \"e02e8e56-086f-4152-accb-b8ffdb55a215\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.765151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ceph\") pod \"e02e8e56-086f-4152-accb-b8ffdb55a215\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.765284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz94d\" (UniqueName: \"kubernetes.io/projected/e02e8e56-086f-4152-accb-b8ffdb55a215-kube-api-access-hz94d\") pod \"e02e8e56-086f-4152-accb-b8ffdb55a215\" (UID: 
\"e02e8e56-086f-4152-accb-b8ffdb55a215\") " Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.766172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ssh-key\") pod \"e02e8e56-086f-4152-accb-b8ffdb55a215\" (UID: \"e02e8e56-086f-4152-accb-b8ffdb55a215\") " Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.776706 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ceph" (OuterVolumeSpecName: "ceph") pod "e02e8e56-086f-4152-accb-b8ffdb55a215" (UID: "e02e8e56-086f-4152-accb-b8ffdb55a215"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.777236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02e8e56-086f-4152-accb-b8ffdb55a215-kube-api-access-hz94d" (OuterVolumeSpecName: "kube-api-access-hz94d") pod "e02e8e56-086f-4152-accb-b8ffdb55a215" (UID: "e02e8e56-086f-4152-accb-b8ffdb55a215"). InnerVolumeSpecName "kube-api-access-hz94d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.806513 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-inventory" (OuterVolumeSpecName: "inventory") pod "e02e8e56-086f-4152-accb-b8ffdb55a215" (UID: "e02e8e56-086f-4152-accb-b8ffdb55a215"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.821663 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e02e8e56-086f-4152-accb-b8ffdb55a215" (UID: "e02e8e56-086f-4152-accb-b8ffdb55a215"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.879477 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.879526 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz94d\" (UniqueName: \"kubernetes.io/projected/e02e8e56-086f-4152-accb-b8ffdb55a215-kube-api-access-hz94d\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.879546 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:20 crc kubenswrapper[4764]: I1001 16:41:20.879562 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e02e8e56-086f-4152-accb-b8ffdb55a215-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.247213 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" event={"ID":"e02e8e56-086f-4152-accb-b8ffdb55a215","Type":"ContainerDied","Data":"618a2db6569d3d66dc771a1aaf38dfff81d19bea95842f3de81897e39544cca9"} Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.247268 4764 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="618a2db6569d3d66dc771a1aaf38dfff81d19bea95842f3de81897e39544cca9" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.247292 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-57zm8" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.363961 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rdtx"] Oct 01 16:41:21 crc kubenswrapper[4764]: E1001 16:41:21.364311 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02e8e56-086f-4152-accb-b8ffdb55a215" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.364328 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02e8e56-086f-4152-accb-b8ffdb55a215" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.364507 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02e8e56-086f-4152-accb-b8ffdb55a215" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.365084 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.369567 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.369599 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.371292 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.371527 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.371744 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.387618 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rdtx"] Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.490800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.490980 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" 
Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.491276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xdrh\" (UniqueName: \"kubernetes.io/projected/56678028-55b3-410f-a642-999c1f035e88-kube-api-access-7xdrh\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.491386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ceph\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.592854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ceph\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.592995 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.593079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") 
" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.593169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xdrh\" (UniqueName: \"kubernetes.io/projected/56678028-55b3-410f-a642-999c1f035e88-kube-api-access-7xdrh\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.599845 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.600400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.600924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ceph\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.631782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xdrh\" (UniqueName: \"kubernetes.io/projected/56678028-55b3-410f-a642-999c1f035e88-kube-api-access-7xdrh\") pod \"ssh-known-hosts-edpm-deployment-5rdtx\" (UID: 
\"56678028-55b3-410f-a642-999c1f035e88\") " pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.695294 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.913678 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:41:21 crc kubenswrapper[4764]: I1001 16:41:21.913742 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:41:22 crc kubenswrapper[4764]: I1001 16:41:22.213686 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-5rdtx"] Oct 01 16:41:22 crc kubenswrapper[4764]: W1001 16:41:22.215501 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56678028_55b3_410f_a642_999c1f035e88.slice/crio-ff81432a808d947e27041e921e76c96aa629cb1a3036bee633afc62976fd5ac7 WatchSource:0}: Error finding container ff81432a808d947e27041e921e76c96aa629cb1a3036bee633afc62976fd5ac7: Status 404 returned error can't find the container with id ff81432a808d947e27041e921e76c96aa629cb1a3036bee633afc62976fd5ac7 Oct 01 16:41:22 crc kubenswrapper[4764]: I1001 16:41:22.217980 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:41:22 crc kubenswrapper[4764]: I1001 16:41:22.269375 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" event={"ID":"56678028-55b3-410f-a642-999c1f035e88","Type":"ContainerStarted","Data":"ff81432a808d947e27041e921e76c96aa629cb1a3036bee633afc62976fd5ac7"} Oct 01 16:41:24 crc kubenswrapper[4764]: I1001 16:41:24.292591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" event={"ID":"56678028-55b3-410f-a642-999c1f035e88","Type":"ContainerStarted","Data":"da50ddb765b8808e95a2d678ff16dc7dbe3e688acff18ad9bbd3ddcb9ecf6c56"} Oct 01 16:41:24 crc kubenswrapper[4764]: I1001 16:41:24.319452 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" podStartSLOduration=2.227691032 podStartE2EDuration="3.319433336s" podCreationTimestamp="2025-10-01 16:41:21 +0000 UTC" firstStartedPulling="2025-10-01 16:41:22.217584074 +0000 UTC m=+2345.217230939" lastFinishedPulling="2025-10-01 16:41:23.309326408 +0000 UTC m=+2346.308973243" observedRunningTime="2025-10-01 16:41:24.31511278 +0000 UTC m=+2347.314759615" watchObservedRunningTime="2025-10-01 16:41:24.319433336 +0000 UTC m=+2347.319080161" Oct 01 16:41:34 crc kubenswrapper[4764]: I1001 16:41:34.419959 4764 generic.go:334] "Generic (PLEG): container finished" podID="56678028-55b3-410f-a642-999c1f035e88" containerID="da50ddb765b8808e95a2d678ff16dc7dbe3e688acff18ad9bbd3ddcb9ecf6c56" exitCode=0 Oct 01 16:41:34 crc kubenswrapper[4764]: I1001 16:41:34.420009 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" event={"ID":"56678028-55b3-410f-a642-999c1f035e88","Type":"ContainerDied","Data":"da50ddb765b8808e95a2d678ff16dc7dbe3e688acff18ad9bbd3ddcb9ecf6c56"} Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.872988 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.907662 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ceph\") pod \"56678028-55b3-410f-a642-999c1f035e88\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.907720 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xdrh\" (UniqueName: \"kubernetes.io/projected/56678028-55b3-410f-a642-999c1f035e88-kube-api-access-7xdrh\") pod \"56678028-55b3-410f-a642-999c1f035e88\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.907809 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ssh-key-openstack-edpm-ipam\") pod \"56678028-55b3-410f-a642-999c1f035e88\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.907876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-inventory-0\") pod \"56678028-55b3-410f-a642-999c1f035e88\" (UID: \"56678028-55b3-410f-a642-999c1f035e88\") " Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.915443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ceph" (OuterVolumeSpecName: "ceph") pod "56678028-55b3-410f-a642-999c1f035e88" (UID: "56678028-55b3-410f-a642-999c1f035e88"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.926338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56678028-55b3-410f-a642-999c1f035e88-kube-api-access-7xdrh" (OuterVolumeSpecName: "kube-api-access-7xdrh") pod "56678028-55b3-410f-a642-999c1f035e88" (UID: "56678028-55b3-410f-a642-999c1f035e88"). InnerVolumeSpecName "kube-api-access-7xdrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:41:35 crc kubenswrapper[4764]: I1001 16:41:35.983670 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "56678028-55b3-410f-a642-999c1f035e88" (UID: "56678028-55b3-410f-a642-999c1f035e88"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.010259 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.010298 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xdrh\" (UniqueName: \"kubernetes.io/projected/56678028-55b3-410f-a642-999c1f035e88-kube-api-access-7xdrh\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.010310 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.015241 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "56678028-55b3-410f-a642-999c1f035e88" (UID: "56678028-55b3-410f-a642-999c1f035e88"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.111616 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/56678028-55b3-410f-a642-999c1f035e88-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.446655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" event={"ID":"56678028-55b3-410f-a642-999c1f035e88","Type":"ContainerDied","Data":"ff81432a808d947e27041e921e76c96aa629cb1a3036bee633afc62976fd5ac7"} Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.446703 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff81432a808d947e27041e921e76c96aa629cb1a3036bee633afc62976fd5ac7" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.446783 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-5rdtx" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.568986 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q"] Oct 01 16:41:36 crc kubenswrapper[4764]: E1001 16:41:36.572182 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56678028-55b3-410f-a642-999c1f035e88" containerName="ssh-known-hosts-edpm-deployment" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.572216 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="56678028-55b3-410f-a642-999c1f035e88" containerName="ssh-known-hosts-edpm-deployment" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.572442 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="56678028-55b3-410f-a642-999c1f035e88" containerName="ssh-known-hosts-edpm-deployment" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.573784 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.577433 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.577635 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.577700 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.578205 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.579569 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.580578 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q"] Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.621217 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.621301 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gndg2\" (UniqueName: \"kubernetes.io/projected/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-kube-api-access-gndg2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.621453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.621641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.723918 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.724269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.725127 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ceph\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.725228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gndg2\" (UniqueName: \"kubernetes.io/projected/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-kube-api-access-gndg2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.728667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.728908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.730000 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.756403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gndg2\" (UniqueName: 
\"kubernetes.io/projected/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-kube-api-access-gndg2\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bj86q\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:36 crc kubenswrapper[4764]: I1001 16:41:36.908526 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:37 crc kubenswrapper[4764]: I1001 16:41:37.517040 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q"] Oct 01 16:41:37 crc kubenswrapper[4764]: W1001 16:41:37.526955 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0fbe741_65d7_464f_b6c6_ecdb60f8bb21.slice/crio-3f307428d75e9d2df5fde48d17737144b8f70209a45d5eb3f3e251837bd28af9 WatchSource:0}: Error finding container 3f307428d75e9d2df5fde48d17737144b8f70209a45d5eb3f3e251837bd28af9: Status 404 returned error can't find the container with id 3f307428d75e9d2df5fde48d17737144b8f70209a45d5eb3f3e251837bd28af9 Oct 01 16:41:38 crc kubenswrapper[4764]: I1001 16:41:38.470803 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" event={"ID":"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21","Type":"ContainerStarted","Data":"3f307428d75e9d2df5fde48d17737144b8f70209a45d5eb3f3e251837bd28af9"} Oct 01 16:41:40 crc kubenswrapper[4764]: I1001 16:41:40.491072 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" event={"ID":"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21","Type":"ContainerStarted","Data":"8f4d6306a1eabb83e258e882cd79ae1992bd36975d5daee93a008f4cab6f9fb6"} Oct 01 16:41:40 crc kubenswrapper[4764]: I1001 16:41:40.514266 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" podStartSLOduration=2.847303335 podStartE2EDuration="4.514246459s" podCreationTimestamp="2025-10-01 16:41:36 +0000 UTC" firstStartedPulling="2025-10-01 16:41:37.529994277 +0000 UTC m=+2360.529641122" lastFinishedPulling="2025-10-01 16:41:39.196937411 +0000 UTC m=+2362.196584246" observedRunningTime="2025-10-01 16:41:40.510346314 +0000 UTC m=+2363.509993179" watchObservedRunningTime="2025-10-01 16:41:40.514246459 +0000 UTC m=+2363.513893304" Oct 01 16:41:48 crc kubenswrapper[4764]: I1001 16:41:48.588614 4764 generic.go:334] "Generic (PLEG): container finished" podID="a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" containerID="8f4d6306a1eabb83e258e882cd79ae1992bd36975d5daee93a008f4cab6f9fb6" exitCode=0 Oct 01 16:41:48 crc kubenswrapper[4764]: I1001 16:41:48.588724 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" event={"ID":"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21","Type":"ContainerDied","Data":"8f4d6306a1eabb83e258e882cd79ae1992bd36975d5daee93a008f4cab6f9fb6"} Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.122108 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.303295 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ssh-key\") pod \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.303355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gndg2\" (UniqueName: \"kubernetes.io/projected/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-kube-api-access-gndg2\") pod \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.303751 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ceph\") pod \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.303779 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-inventory\") pod \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\" (UID: \"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21\") " Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.312612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-kube-api-access-gndg2" (OuterVolumeSpecName: "kube-api-access-gndg2") pod "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" (UID: "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21"). InnerVolumeSpecName "kube-api-access-gndg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.322267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ceph" (OuterVolumeSpecName: "ceph") pod "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" (UID: "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.351010 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" (UID: "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.351070 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-inventory" (OuterVolumeSpecName: "inventory") pod "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" (UID: "a0fbe741-65d7-464f-b6c6-ecdb60f8bb21"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.406418 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.406461 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.406478 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.406494 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gndg2\" (UniqueName: \"kubernetes.io/projected/a0fbe741-65d7-464f-b6c6-ecdb60f8bb21-kube-api-access-gndg2\") on node \"crc\" DevicePath \"\"" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.613935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" event={"ID":"a0fbe741-65d7-464f-b6c6-ecdb60f8bb21","Type":"ContainerDied","Data":"3f307428d75e9d2df5fde48d17737144b8f70209a45d5eb3f3e251837bd28af9"} Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.613976 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f307428d75e9d2df5fde48d17737144b8f70209a45d5eb3f3e251837bd28af9" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.614024 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bj86q" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.685711 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc"] Oct 01 16:41:50 crc kubenswrapper[4764]: E1001 16:41:50.686505 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.686521 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.686707 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0fbe741-65d7-464f-b6c6-ecdb60f8bb21" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.687354 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.689895 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.691282 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.691464 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.691715 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.697695 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc"] Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.712340 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.819324 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.819431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hzl\" (UniqueName: \"kubernetes.io/projected/a20f9deb-1422-462b-81d1-89cfef47f81d-kube-api-access-h4hzl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.819477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.819640 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.921315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.921413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.921574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.921613 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hzl\" (UniqueName: \"kubernetes.io/projected/a20f9deb-1422-462b-81d1-89cfef47f81d-kube-api-access-h4hzl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.927365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.927599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.927728 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:50 crc kubenswrapper[4764]: I1001 16:41:50.941192 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hzl\" (UniqueName: \"kubernetes.io/projected/a20f9deb-1422-462b-81d1-89cfef47f81d-kube-api-access-h4hzl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:51 crc kubenswrapper[4764]: I1001 16:41:51.017025 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:41:51 crc kubenswrapper[4764]: I1001 16:41:51.760674 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc"] Oct 01 16:41:51 crc kubenswrapper[4764]: I1001 16:41:51.913931 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:41:51 crc kubenswrapper[4764]: I1001 16:41:51.913984 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:41:52 crc kubenswrapper[4764]: I1001 16:41:52.634254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" event={"ID":"a20f9deb-1422-462b-81d1-89cfef47f81d","Type":"ContainerStarted","Data":"d232bf07e6408b29f987c0a8fe53ee037628c9bce907241eabe9d33e14a98462"} Oct 01 16:41:53 crc kubenswrapper[4764]: I1001 16:41:53.644559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" event={"ID":"a20f9deb-1422-462b-81d1-89cfef47f81d","Type":"ContainerStarted","Data":"a3ff8922942c2b92d46681fe1651f2146192aee1a1698b9e4a6b9562c672799a"} Oct 01 16:41:53 crc kubenswrapper[4764]: I1001 16:41:53.673266 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" podStartSLOduration=3.092069198 podStartE2EDuration="3.673248493s" podCreationTimestamp="2025-10-01 16:41:50 +0000 UTC" firstStartedPulling="2025-10-01 16:41:51.762364154 +0000 UTC m=+2374.762010999" lastFinishedPulling="2025-10-01 16:41:52.343543449 +0000 UTC m=+2375.343190294" observedRunningTime="2025-10-01 16:41:53.665162924 +0000 UTC m=+2376.664809769" watchObservedRunningTime="2025-10-01 16:41:53.673248493 +0000 UTC m=+2376.672895338" Oct 01 16:42:03 crc kubenswrapper[4764]: I1001 16:42:03.743615 4764 generic.go:334] "Generic (PLEG): container finished" podID="a20f9deb-1422-462b-81d1-89cfef47f81d" containerID="a3ff8922942c2b92d46681fe1651f2146192aee1a1698b9e4a6b9562c672799a" exitCode=0 Oct 01 16:42:03 crc kubenswrapper[4764]: I1001 16:42:03.743736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" event={"ID":"a20f9deb-1422-462b-81d1-89cfef47f81d","Type":"ContainerDied","Data":"a3ff8922942c2b92d46681fe1651f2146192aee1a1698b9e4a6b9562c672799a"} Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.359358 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.540318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ssh-key\") pod \"a20f9deb-1422-462b-81d1-89cfef47f81d\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.540481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ceph\") pod \"a20f9deb-1422-462b-81d1-89cfef47f81d\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.540596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-inventory\") pod \"a20f9deb-1422-462b-81d1-89cfef47f81d\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.540657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4hzl\" (UniqueName: \"kubernetes.io/projected/a20f9deb-1422-462b-81d1-89cfef47f81d-kube-api-access-h4hzl\") pod \"a20f9deb-1422-462b-81d1-89cfef47f81d\" (UID: \"a20f9deb-1422-462b-81d1-89cfef47f81d\") " Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.546790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ceph" (OuterVolumeSpecName: "ceph") pod "a20f9deb-1422-462b-81d1-89cfef47f81d" (UID: "a20f9deb-1422-462b-81d1-89cfef47f81d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.558553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20f9deb-1422-462b-81d1-89cfef47f81d-kube-api-access-h4hzl" (OuterVolumeSpecName: "kube-api-access-h4hzl") pod "a20f9deb-1422-462b-81d1-89cfef47f81d" (UID: "a20f9deb-1422-462b-81d1-89cfef47f81d"). InnerVolumeSpecName "kube-api-access-h4hzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.571886 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a20f9deb-1422-462b-81d1-89cfef47f81d" (UID: "a20f9deb-1422-462b-81d1-89cfef47f81d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.595298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-inventory" (OuterVolumeSpecName: "inventory") pod "a20f9deb-1422-462b-81d1-89cfef47f81d" (UID: "a20f9deb-1422-462b-81d1-89cfef47f81d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.643004 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4hzl\" (UniqueName: \"kubernetes.io/projected/a20f9deb-1422-462b-81d1-89cfef47f81d-kube-api-access-h4hzl\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.643059 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.643072 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.643082 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a20f9deb-1422-462b-81d1-89cfef47f81d-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.770933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" event={"ID":"a20f9deb-1422-462b-81d1-89cfef47f81d","Type":"ContainerDied","Data":"d232bf07e6408b29f987c0a8fe53ee037628c9bce907241eabe9d33e14a98462"} Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.771003 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d232bf07e6408b29f987c0a8fe53ee037628c9bce907241eabe9d33e14a98462" Oct 01 16:42:05 crc kubenswrapper[4764]: I1001 16:42:05.771027 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.027902 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f"] Oct 01 16:42:06 crc kubenswrapper[4764]: E1001 16:42:06.028487 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20f9deb-1422-462b-81d1-89cfef47f81d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.028517 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20f9deb-1422-462b-81d1-89cfef47f81d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.028792 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20f9deb-1422-462b-81d1-89cfef47f81d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.029749 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.034199 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.035019 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.035837 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.036345 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.036614 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.036876 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.037750 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.040335 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.045910 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f"] Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151171 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151628 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.151949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.152062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.152094 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.152164 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.152221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jph47\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-kube-api-access-jph47\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: 
\"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 
16:42:06.253501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253551 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jph47\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-kube-api-access-jph47\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253668 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253686 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.253726 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.258615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.258965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.259140 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.267249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.267935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.268158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.268354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.268383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.268997 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.269665 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.270235 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.270334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.273778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jph47\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-kube-api-access-jph47\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.353143 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:06 crc kubenswrapper[4764]: W1001 16:42:06.938560 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6f12828_d7f8_45a2_932c_b866030ce666.slice/crio-984ee2f7adbf443eaa913f440097fd651c6a7640dd0e464c807e6a4e5ab3d91f WatchSource:0}: Error finding container 984ee2f7adbf443eaa913f440097fd651c6a7640dd0e464c807e6a4e5ab3d91f: Status 404 returned error can't find the container with id 984ee2f7adbf443eaa913f440097fd651c6a7640dd0e464c807e6a4e5ab3d91f Oct 01 16:42:06 crc kubenswrapper[4764]: I1001 16:42:06.946803 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f"] Oct 01 16:42:07 crc kubenswrapper[4764]: I1001 16:42:07.786834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" event={"ID":"c6f12828-d7f8-45a2-932c-b866030ce666","Type":"ContainerStarted","Data":"984ee2f7adbf443eaa913f440097fd651c6a7640dd0e464c807e6a4e5ab3d91f"} Oct 01 16:42:08 crc kubenswrapper[4764]: I1001 16:42:08.804470 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" event={"ID":"c6f12828-d7f8-45a2-932c-b866030ce666","Type":"ContainerStarted","Data":"cb2d046387c11458f6f1611c49b0a1d7b9b7555a2b444e646a712308454349fa"} Oct 01 16:42:08 crc kubenswrapper[4764]: I1001 16:42:08.839646 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" podStartSLOduration=1.601085386 podStartE2EDuration="2.839608769s" podCreationTimestamp="2025-10-01 16:42:06 +0000 UTC" firstStartedPulling="2025-10-01 16:42:06.941563905 +0000 UTC m=+2389.941210740" lastFinishedPulling="2025-10-01 16:42:08.180087258 +0000 UTC m=+2391.179734123" observedRunningTime="2025-10-01 16:42:08.828706961 +0000 UTC m=+2391.828353806" watchObservedRunningTime="2025-10-01 16:42:08.839608769 +0000 UTC m=+2391.839255654" Oct 01 16:42:21 crc kubenswrapper[4764]: I1001 16:42:21.914005 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:42:21 crc kubenswrapper[4764]: I1001 16:42:21.914788 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:42:21 crc kubenswrapper[4764]: I1001 16:42:21.914860 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:42:21 crc kubenswrapper[4764]: I1001 16:42:21.916009 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:42:21 crc kubenswrapper[4764]: I1001 16:42:21.916151 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" gracePeriod=600 Oct 01 16:42:22 crc kubenswrapper[4764]: E1001 16:42:22.066430 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:42:22 crc kubenswrapper[4764]: I1001 16:42:22.954902 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" exitCode=0 Oct 01 16:42:22 crc kubenswrapper[4764]: I1001 16:42:22.954983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"} Oct 01 16:42:22 crc kubenswrapper[4764]: I1001 16:42:22.955456 4764 scope.go:117] "RemoveContainer" containerID="d0b778d1376a46a6d5f3e2b028c1b5a6ac86778ba307f59878807e5928f0d1b7" Oct 01 16:42:22 crc kubenswrapper[4764]: I1001 16:42:22.956213 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:42:22 crc kubenswrapper[4764]: E1001 16:42:22.956510 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:42:37 crc kubenswrapper[4764]: I1001 16:42:37.726813 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:42:37 crc kubenswrapper[4764]: E1001 16:42:37.727477 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:42:44 crc kubenswrapper[4764]: I1001 16:42:44.171864 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6f12828-d7f8-45a2-932c-b866030ce666" containerID="cb2d046387c11458f6f1611c49b0a1d7b9b7555a2b444e646a712308454349fa" exitCode=0 Oct 01 16:42:44 crc kubenswrapper[4764]: I1001 16:42:44.171941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" event={"ID":"c6f12828-d7f8-45a2-932c-b866030ce666","Type":"ContainerDied","Data":"cb2d046387c11458f6f1611c49b0a1d7b9b7555a2b444e646a712308454349fa"} Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.616650 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.708109 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.708181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ssh-key\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.708214 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-neutron-metadata-combined-ca-bundle\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.708257 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.708279 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-libvirt-combined-ca-bundle\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: 
\"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.708951 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ceph\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709028 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jph47\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-kube-api-access-jph47\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709109 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ovn-combined-ca-bundle\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709138 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-bootstrap-combined-ca-bundle\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-inventory\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709247 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-nova-combined-ca-bundle\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709282 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.709339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-repo-setup-combined-ca-bundle\") pod \"c6f12828-d7f8-45a2-932c-b866030ce666\" (UID: \"c6f12828-d7f8-45a2-932c-b866030ce666\") " Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.714419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.714511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.714804 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.715007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.715863 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.717615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.717933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-kube-api-access-jph47" (OuterVolumeSpecName: "kube-api-access-jph47") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "kube-api-access-jph47". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.718501 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ceph" (OuterVolumeSpecName: "ceph") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.720100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.727228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.729240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.736318 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-inventory" (OuterVolumeSpecName: "inventory") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.749005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6f12828-d7f8-45a2-932c-b866030ce666" (UID: "c6f12828-d7f8-45a2-932c-b866030ce666"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811478 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811522 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811541 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811558 4764 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811577 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811595 4764 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811614 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811632 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811650 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811668 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811688 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811707 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6f12828-d7f8-45a2-932c-b866030ce666-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:45 crc kubenswrapper[4764]: I1001 16:42:45.811724 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jph47\" (UniqueName: \"kubernetes.io/projected/c6f12828-d7f8-45a2-932c-b866030ce666-kube-api-access-jph47\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.194182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" event={"ID":"c6f12828-d7f8-45a2-932c-b866030ce666","Type":"ContainerDied","Data":"984ee2f7adbf443eaa913f440097fd651c6a7640dd0e464c807e6a4e5ab3d91f"} Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.194237 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="984ee2f7adbf443eaa913f440097fd651c6a7640dd0e464c807e6a4e5ab3d91f" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.194260 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.308587 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq"] Oct 01 16:42:46 crc kubenswrapper[4764]: E1001 16:42:46.309097 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f12828-d7f8-45a2-932c-b866030ce666" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.309125 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f12828-d7f8-45a2-932c-b866030ce666" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.309399 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f12828-d7f8-45a2-932c-b866030ce666" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.310125 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.312186 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.312880 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.312897 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.314175 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.314314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.335125 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq"] Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.423761 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.423815 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zj98\" (UniqueName: \"kubernetes.io/projected/c539a876-f4e2-41db-aa15-6a54e4ac75c6-kube-api-access-4zj98\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: 
\"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.423861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.424155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.526211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.526285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zj98\" (UniqueName: \"kubernetes.io/projected/c539a876-f4e2-41db-aa15-6a54e4ac75c6-kube-api-access-4zj98\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.526339 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.526427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.531539 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ssh-key\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.531627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.532134 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: 
I1001 16:42:46.553587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zj98\" (UniqueName: \"kubernetes.io/projected/c539a876-f4e2-41db-aa15-6a54e4ac75c6-kube-api-access-4zj98\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:46 crc kubenswrapper[4764]: I1001 16:42:46.642942 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:47 crc kubenswrapper[4764]: I1001 16:42:47.220371 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq"] Oct 01 16:42:47 crc kubenswrapper[4764]: W1001 16:42:47.234721 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc539a876_f4e2_41db_aa15_6a54e4ac75c6.slice/crio-2b1b408cbd3913f58f6e35e541361c08ea2d0817fae242f28f8ad93c84c05549 WatchSource:0}: Error finding container 2b1b408cbd3913f58f6e35e541361c08ea2d0817fae242f28f8ad93c84c05549: Status 404 returned error can't find the container with id 2b1b408cbd3913f58f6e35e541361c08ea2d0817fae242f28f8ad93c84c05549 Oct 01 16:42:48 crc kubenswrapper[4764]: I1001 16:42:48.218248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" event={"ID":"c539a876-f4e2-41db-aa15-6a54e4ac75c6","Type":"ContainerStarted","Data":"2b1b408cbd3913f58f6e35e541361c08ea2d0817fae242f28f8ad93c84c05549"} Oct 01 16:42:49 crc kubenswrapper[4764]: I1001 16:42:49.227678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" 
event={"ID":"c539a876-f4e2-41db-aa15-6a54e4ac75c6","Type":"ContainerStarted","Data":"217351acbb8e99f5bf95dc3a33e8c8f625420373bf47ed2901a09391cd52d9c9"} Oct 01 16:42:49 crc kubenswrapper[4764]: I1001 16:42:49.251115 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" podStartSLOduration=1.678071426 podStartE2EDuration="3.2510939s" podCreationTimestamp="2025-10-01 16:42:46 +0000 UTC" firstStartedPulling="2025-10-01 16:42:47.237392734 +0000 UTC m=+2430.237039569" lastFinishedPulling="2025-10-01 16:42:48.810415208 +0000 UTC m=+2431.810062043" observedRunningTime="2025-10-01 16:42:49.245114563 +0000 UTC m=+2432.244761438" watchObservedRunningTime="2025-10-01 16:42:49.2510939 +0000 UTC m=+2432.250740735" Oct 01 16:42:50 crc kubenswrapper[4764]: I1001 16:42:50.728812 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:42:50 crc kubenswrapper[4764]: E1001 16:42:50.729400 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:42:55 crc kubenswrapper[4764]: I1001 16:42:55.286965 4764 generic.go:334] "Generic (PLEG): container finished" podID="c539a876-f4e2-41db-aa15-6a54e4ac75c6" containerID="217351acbb8e99f5bf95dc3a33e8c8f625420373bf47ed2901a09391cd52d9c9" exitCode=0 Oct 01 16:42:55 crc kubenswrapper[4764]: I1001 16:42:55.287036 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" 
event={"ID":"c539a876-f4e2-41db-aa15-6a54e4ac75c6","Type":"ContainerDied","Data":"217351acbb8e99f5bf95dc3a33e8c8f625420373bf47ed2901a09391cd52d9c9"} Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.798262 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.946450 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ssh-key\") pod \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.946546 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zj98\" (UniqueName: \"kubernetes.io/projected/c539a876-f4e2-41db-aa15-6a54e4ac75c6-kube-api-access-4zj98\") pod \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.946587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ceph\") pod \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.946757 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-inventory\") pod \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\" (UID: \"c539a876-f4e2-41db-aa15-6a54e4ac75c6\") " Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.952909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c539a876-f4e2-41db-aa15-6a54e4ac75c6-kube-api-access-4zj98" (OuterVolumeSpecName: 
"kube-api-access-4zj98") pod "c539a876-f4e2-41db-aa15-6a54e4ac75c6" (UID: "c539a876-f4e2-41db-aa15-6a54e4ac75c6"). InnerVolumeSpecName "kube-api-access-4zj98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.956138 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ceph" (OuterVolumeSpecName: "ceph") pod "c539a876-f4e2-41db-aa15-6a54e4ac75c6" (UID: "c539a876-f4e2-41db-aa15-6a54e4ac75c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.979059 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-inventory" (OuterVolumeSpecName: "inventory") pod "c539a876-f4e2-41db-aa15-6a54e4ac75c6" (UID: "c539a876-f4e2-41db-aa15-6a54e4ac75c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:56 crc kubenswrapper[4764]: I1001 16:42:56.983610 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c539a876-f4e2-41db-aa15-6a54e4ac75c6" (UID: "c539a876-f4e2-41db-aa15-6a54e4ac75c6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.049769 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zj98\" (UniqueName: \"kubernetes.io/projected/c539a876-f4e2-41db-aa15-6a54e4ac75c6-kube-api-access-4zj98\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.049845 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.049859 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.049881 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c539a876-f4e2-41db-aa15-6a54e4ac75c6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.309732 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" event={"ID":"c539a876-f4e2-41db-aa15-6a54e4ac75c6","Type":"ContainerDied","Data":"2b1b408cbd3913f58f6e35e541361c08ea2d0817fae242f28f8ad93c84c05549"} Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.309772 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.309784 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1b408cbd3913f58f6e35e541361c08ea2d0817fae242f28f8ad93c84c05549" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.379515 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8"] Oct 01 16:42:57 crc kubenswrapper[4764]: E1001 16:42:57.379884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c539a876-f4e2-41db-aa15-6a54e4ac75c6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.379901 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c539a876-f4e2-41db-aa15-6a54e4ac75c6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.380128 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c539a876-f4e2-41db-aa15-6a54e4ac75c6" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.380712 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.386168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.386204 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.386439 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.386225 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.386492 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.386845 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.399166 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8"] Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.456608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssgf\" (UniqueName: \"kubernetes.io/projected/ad66f863-1f1b-40f8-8a3f-464eaf32a344-kube-api-access-bssgf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.456697 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.456830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.456864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.456909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.456940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" 
Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.559027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.559156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.559191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.559218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.559269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssgf\" (UniqueName: \"kubernetes.io/projected/ad66f863-1f1b-40f8-8a3f-464eaf32a344-kube-api-access-bssgf\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.559317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.560489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.564520 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.564824 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.565571 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.573612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.576829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssgf\" (UniqueName: \"kubernetes.io/projected/ad66f863-1f1b-40f8-8a3f-464eaf32a344-kube-api-access-bssgf\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xjmq8\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:57 crc kubenswrapper[4764]: I1001 16:42:57.701119 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:42:58 crc kubenswrapper[4764]: I1001 16:42:58.296255 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8"] Oct 01 16:42:58 crc kubenswrapper[4764]: I1001 16:42:58.322822 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" event={"ID":"ad66f863-1f1b-40f8-8a3f-464eaf32a344","Type":"ContainerStarted","Data":"56506bd362ae44d93ef62cec227d422edb540cfed073515206275c0faba15801"} Oct 01 16:43:00 crc kubenswrapper[4764]: I1001 16:43:00.339098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" event={"ID":"ad66f863-1f1b-40f8-8a3f-464eaf32a344","Type":"ContainerStarted","Data":"2afdce1c21e408b8ad5820ae602eda34a72b8939e2faa24931bb35d002144594"} Oct 01 16:43:00 crc kubenswrapper[4764]: I1001 16:43:00.365189 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" podStartSLOduration=2.580567645 podStartE2EDuration="3.36515964s" podCreationTimestamp="2025-10-01 16:42:57 +0000 UTC" firstStartedPulling="2025-10-01 16:42:58.309929043 +0000 UTC m=+2441.309575878" lastFinishedPulling="2025-10-01 16:42:59.094521028 +0000 UTC m=+2442.094167873" observedRunningTime="2025-10-01 16:43:00.358694261 +0000 UTC m=+2443.358341136" watchObservedRunningTime="2025-10-01 16:43:00.36515964 +0000 UTC m=+2443.364806495" Oct 01 16:43:02 crc kubenswrapper[4764]: I1001 16:43:02.722459 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:43:02 crc kubenswrapper[4764]: E1001 16:43:02.723485 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:43:16 crc kubenswrapper[4764]: I1001 16:43:16.723040 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:43:16 crc kubenswrapper[4764]: E1001 16:43:16.724604 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:43:28 crc kubenswrapper[4764]: I1001 16:43:28.723308 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:43:28 crc kubenswrapper[4764]: E1001 16:43:28.724662 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:43:40 crc kubenswrapper[4764]: I1001 16:43:40.723256 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:43:40 crc kubenswrapper[4764]: E1001 16:43:40.725014 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:43:52 crc kubenswrapper[4764]: I1001 16:43:52.722852 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:43:52 crc kubenswrapper[4764]: E1001 16:43:52.724102 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:44:05 crc kubenswrapper[4764]: I1001 16:44:05.723461 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:44:05 crc kubenswrapper[4764]: E1001 16:44:05.724683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:44:14 crc kubenswrapper[4764]: I1001 16:44:14.113840 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad66f863-1f1b-40f8-8a3f-464eaf32a344" containerID="2afdce1c21e408b8ad5820ae602eda34a72b8939e2faa24931bb35d002144594" exitCode=0 Oct 01 16:44:14 crc kubenswrapper[4764]: I1001 16:44:14.113848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" event={"ID":"ad66f863-1f1b-40f8-8a3f-464eaf32a344","Type":"ContainerDied","Data":"2afdce1c21e408b8ad5820ae602eda34a72b8939e2faa24931bb35d002144594"} Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.676277 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.829311 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bssgf\" (UniqueName: \"kubernetes.io/projected/ad66f863-1f1b-40f8-8a3f-464eaf32a344-kube-api-access-bssgf\") pod \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.829361 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovncontroller-config-0\") pod \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.829400 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ssh-key\") pod \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.829478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovn-combined-ca-bundle\") pod \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.830367 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-inventory\") pod \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.830460 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ceph\") pod \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\" (UID: \"ad66f863-1f1b-40f8-8a3f-464eaf32a344\") " Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.838229 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ceph" (OuterVolumeSpecName: "ceph") pod "ad66f863-1f1b-40f8-8a3f-464eaf32a344" (UID: "ad66f863-1f1b-40f8-8a3f-464eaf32a344"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.838680 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad66f863-1f1b-40f8-8a3f-464eaf32a344-kube-api-access-bssgf" (OuterVolumeSpecName: "kube-api-access-bssgf") pod "ad66f863-1f1b-40f8-8a3f-464eaf32a344" (UID: "ad66f863-1f1b-40f8-8a3f-464eaf32a344"). InnerVolumeSpecName "kube-api-access-bssgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.843313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ad66f863-1f1b-40f8-8a3f-464eaf32a344" (UID: "ad66f863-1f1b-40f8-8a3f-464eaf32a344"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.866212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-inventory" (OuterVolumeSpecName: "inventory") pod "ad66f863-1f1b-40f8-8a3f-464eaf32a344" (UID: "ad66f863-1f1b-40f8-8a3f-464eaf32a344"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.876388 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad66f863-1f1b-40f8-8a3f-464eaf32a344" (UID: "ad66f863-1f1b-40f8-8a3f-464eaf32a344"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.887731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ad66f863-1f1b-40f8-8a3f-464eaf32a344" (UID: "ad66f863-1f1b-40f8-8a3f-464eaf32a344"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.933808 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bssgf\" (UniqueName: \"kubernetes.io/projected/ad66f863-1f1b-40f8-8a3f-464eaf32a344-kube-api-access-bssgf\") on node \"crc\" DevicePath \"\"" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.933846 4764 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.933859 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.933872 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.933883 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:44:15 crc kubenswrapper[4764]: I1001 16:44:15.933896 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ad66f863-1f1b-40f8-8a3f-464eaf32a344-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.143698 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.143533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xjmq8" event={"ID":"ad66f863-1f1b-40f8-8a3f-464eaf32a344","Type":"ContainerDied","Data":"56506bd362ae44d93ef62cec227d422edb540cfed073515206275c0faba15801"} Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.143961 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56506bd362ae44d93ef62cec227d422edb540cfed073515206275c0faba15801" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.355709 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw"] Oct 01 16:44:16 crc kubenswrapper[4764]: E1001 16:44:16.356210 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad66f863-1f1b-40f8-8a3f-464eaf32a344" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.356232 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad66f863-1f1b-40f8-8a3f-464eaf32a344" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.356420 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad66f863-1f1b-40f8-8a3f-464eaf32a344" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.357028 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.360031 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.360913 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.361338 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.361589 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.361622 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.361843 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.362287 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.377871 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw"] Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.442421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc 
kubenswrapper[4764]: I1001 16:44:16.442461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.442481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.442511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.442592 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.442641 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnbz8\" (UniqueName: \"kubernetes.io/projected/199ab555-85f7-4168-9e83-a5060e006dc4-kube-api-access-hnbz8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.442681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545366 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545425 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545491 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnbz8\" (UniqueName: \"kubernetes.io/projected/199ab555-85f7-4168-9e83-a5060e006dc4-kube-api-access-hnbz8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.545548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.551614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.551857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.552275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.553295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.553344 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.554264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.569899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnbz8\" (UniqueName: \"kubernetes.io/projected/199ab555-85f7-4168-9e83-a5060e006dc4-kube-api-access-hnbz8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:16 crc kubenswrapper[4764]: I1001 16:44:16.682036 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:44:17 crc kubenswrapper[4764]: I1001 16:44:17.310397 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw"] Oct 01 16:44:17 crc kubenswrapper[4764]: W1001 16:44:17.321516 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod199ab555_85f7_4168_9e83_a5060e006dc4.slice/crio-e78319b9c6ac363d3f713aaba035794423d78ded0d4d2a89a968d2663bcade91 WatchSource:0}: Error finding container e78319b9c6ac363d3f713aaba035794423d78ded0d4d2a89a968d2663bcade91: Status 404 returned error can't find the container with id e78319b9c6ac363d3f713aaba035794423d78ded0d4d2a89a968d2663bcade91 Oct 01 16:44:17 crc kubenswrapper[4764]: I1001 16:44:17.738316 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:44:17 crc kubenswrapper[4764]: E1001 16:44:17.739144 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:44:17 crc kubenswrapper[4764]: I1001 16:44:17.864806 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:44:18 crc kubenswrapper[4764]: I1001 16:44:18.175569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" 
event={"ID":"199ab555-85f7-4168-9e83-a5060e006dc4","Type":"ContainerStarted","Data":"e78319b9c6ac363d3f713aaba035794423d78ded0d4d2a89a968d2663bcade91"} Oct 01 16:44:19 crc kubenswrapper[4764]: I1001 16:44:19.185477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" event={"ID":"199ab555-85f7-4168-9e83-a5060e006dc4","Type":"ContainerStarted","Data":"009e59282d80c5dd4792ab9f12ac0efb28b005066482d9da15e7be8d960c290b"} Oct 01 16:44:19 crc kubenswrapper[4764]: I1001 16:44:19.208071 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" podStartSLOduration=2.6756824200000002 podStartE2EDuration="3.208027945s" podCreationTimestamp="2025-10-01 16:44:16 +0000 UTC" firstStartedPulling="2025-10-01 16:44:17.326819745 +0000 UTC m=+2520.326466620" lastFinishedPulling="2025-10-01 16:44:17.85916529 +0000 UTC m=+2520.858812145" observedRunningTime="2025-10-01 16:44:19.201615387 +0000 UTC m=+2522.201262222" watchObservedRunningTime="2025-10-01 16:44:19.208027945 +0000 UTC m=+2522.207674780" Oct 01 16:44:32 crc kubenswrapper[4764]: I1001 16:44:32.722014 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:44:32 crc kubenswrapper[4764]: E1001 16:44:32.722838 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:44:43 crc kubenswrapper[4764]: I1001 16:44:43.722507 4764 scope.go:117] "RemoveContainer" 
containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:44:43 crc kubenswrapper[4764]: E1001 16:44:43.723860 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:44:55 crc kubenswrapper[4764]: I1001 16:44:55.722826 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:44:55 crc kubenswrapper[4764]: E1001 16:44:55.723767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.154710 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l"] Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.156537 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.158799 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.159298 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.177652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l"] Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.247865 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhkt\" (UniqueName: \"kubernetes.io/projected/e3dd200a-bd47-483f-b11a-e457709ba4f9-kube-api-access-zqhkt\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.247926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3dd200a-bd47-483f-b11a-e457709ba4f9-config-volume\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.247952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3dd200a-bd47-483f-b11a-e457709ba4f9-secret-volume\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.350177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhkt\" (UniqueName: \"kubernetes.io/projected/e3dd200a-bd47-483f-b11a-e457709ba4f9-kube-api-access-zqhkt\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.350245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3dd200a-bd47-483f-b11a-e457709ba4f9-config-volume\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.350273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3dd200a-bd47-483f-b11a-e457709ba4f9-secret-volume\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.351586 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3dd200a-bd47-483f-b11a-e457709ba4f9-config-volume\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.357929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e3dd200a-bd47-483f-b11a-e457709ba4f9-secret-volume\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.375381 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhkt\" (UniqueName: \"kubernetes.io/projected/e3dd200a-bd47-483f-b11a-e457709ba4f9-kube-api-access-zqhkt\") pod \"collect-profiles-29322285-xg84l\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.479770 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:00 crc kubenswrapper[4764]: I1001 16:45:00.968450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l"] Oct 01 16:45:01 crc kubenswrapper[4764]: I1001 16:45:01.650820 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3dd200a-bd47-483f-b11a-e457709ba4f9" containerID="3f0589fd95ce907aa39a8221283736793928c4f5e61fb8f4152e0ee27183c2e0" exitCode=0 Oct 01 16:45:01 crc kubenswrapper[4764]: I1001 16:45:01.650864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" event={"ID":"e3dd200a-bd47-483f-b11a-e457709ba4f9","Type":"ContainerDied","Data":"3f0589fd95ce907aa39a8221283736793928c4f5e61fb8f4152e0ee27183c2e0"} Oct 01 16:45:01 crc kubenswrapper[4764]: I1001 16:45:01.651101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" 
event={"ID":"e3dd200a-bd47-483f-b11a-e457709ba4f9","Type":"ContainerStarted","Data":"02c2136934001b99a8f79ca164d4757e5b8d6d0381764a3eb03ac6ebf593600f"} Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.053698 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.103842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3dd200a-bd47-483f-b11a-e457709ba4f9-config-volume\") pod \"e3dd200a-bd47-483f-b11a-e457709ba4f9\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.104003 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhkt\" (UniqueName: \"kubernetes.io/projected/e3dd200a-bd47-483f-b11a-e457709ba4f9-kube-api-access-zqhkt\") pod \"e3dd200a-bd47-483f-b11a-e457709ba4f9\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.104243 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3dd200a-bd47-483f-b11a-e457709ba4f9-secret-volume\") pod \"e3dd200a-bd47-483f-b11a-e457709ba4f9\" (UID: \"e3dd200a-bd47-483f-b11a-e457709ba4f9\") " Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.105245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3dd200a-bd47-483f-b11a-e457709ba4f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3dd200a-bd47-483f-b11a-e457709ba4f9" (UID: "e3dd200a-bd47-483f-b11a-e457709ba4f9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.106800 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3dd200a-bd47-483f-b11a-e457709ba4f9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.112946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3dd200a-bd47-483f-b11a-e457709ba4f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3dd200a-bd47-483f-b11a-e457709ba4f9" (UID: "e3dd200a-bd47-483f-b11a-e457709ba4f9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.113642 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3dd200a-bd47-483f-b11a-e457709ba4f9-kube-api-access-zqhkt" (OuterVolumeSpecName: "kube-api-access-zqhkt") pod "e3dd200a-bd47-483f-b11a-e457709ba4f9" (UID: "e3dd200a-bd47-483f-b11a-e457709ba4f9"). InnerVolumeSpecName "kube-api-access-zqhkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.209616 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3dd200a-bd47-483f-b11a-e457709ba4f9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.209678 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhkt\" (UniqueName: \"kubernetes.io/projected/e3dd200a-bd47-483f-b11a-e457709ba4f9-kube-api-access-zqhkt\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.667512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" event={"ID":"e3dd200a-bd47-483f-b11a-e457709ba4f9","Type":"ContainerDied","Data":"02c2136934001b99a8f79ca164d4757e5b8d6d0381764a3eb03ac6ebf593600f"} Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.667751 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c2136934001b99a8f79ca164d4757e5b8d6d0381764a3eb03ac6ebf593600f" Oct 01 16:45:03 crc kubenswrapper[4764]: I1001 16:45:03.667599 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322285-xg84l" Oct 01 16:45:04 crc kubenswrapper[4764]: I1001 16:45:04.128691 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s"] Oct 01 16:45:04 crc kubenswrapper[4764]: I1001 16:45:04.140286 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-f5k2s"] Oct 01 16:45:05 crc kubenswrapper[4764]: I1001 16:45:05.740817 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee3fedb-c6d1-421a-85f5-a46b964a47b7" path="/var/lib/kubelet/pods/0ee3fedb-c6d1-421a-85f5-a46b964a47b7/volumes" Oct 01 16:45:06 crc kubenswrapper[4764]: I1001 16:45:06.722400 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:45:06 crc kubenswrapper[4764]: E1001 16:45:06.722633 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:45:19 crc kubenswrapper[4764]: I1001 16:45:19.722017 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:45:19 crc kubenswrapper[4764]: E1001 16:45:19.724933 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:45:19 crc kubenswrapper[4764]: I1001 16:45:19.827668 4764 generic.go:334] "Generic (PLEG): container finished" podID="199ab555-85f7-4168-9e83-a5060e006dc4" containerID="009e59282d80c5dd4792ab9f12ac0efb28b005066482d9da15e7be8d960c290b" exitCode=0 Oct 01 16:45:19 crc kubenswrapper[4764]: I1001 16:45:19.827711 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" event={"ID":"199ab555-85f7-4168-9e83-a5060e006dc4","Type":"ContainerDied","Data":"009e59282d80c5dd4792ab9f12ac0efb28b005066482d9da15e7be8d960c290b"} Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.334078 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.480990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-inventory\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.481249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.481362 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ceph\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: 
\"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.481445 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-metadata-combined-ca-bundle\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.481472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-nova-metadata-neutron-config-0\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.481544 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnbz8\" (UniqueName: \"kubernetes.io/projected/199ab555-85f7-4168-9e83-a5060e006dc4-kube-api-access-hnbz8\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.481630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ssh-key\") pod \"199ab555-85f7-4168-9e83-a5060e006dc4\" (UID: \"199ab555-85f7-4168-9e83-a5060e006dc4\") " Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.486988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ceph" (OuterVolumeSpecName: "ceph") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.487261 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.488754 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199ab555-85f7-4168-9e83-a5060e006dc4-kube-api-access-hnbz8" (OuterVolumeSpecName: "kube-api-access-hnbz8") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "kube-api-access-hnbz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.522474 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-inventory" (OuterVolumeSpecName: "inventory") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.524745 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.535548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.539880 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "199ab555-85f7-4168-9e83-a5060e006dc4" (UID: "199ab555-85f7-4168-9e83-a5060e006dc4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.585781 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.586070 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.586160 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.586226 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnbz8\" (UniqueName: \"kubernetes.io/projected/199ab555-85f7-4168-9e83-a5060e006dc4-kube-api-access-hnbz8\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.586309 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.586378 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.586473 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/199ab555-85f7-4168-9e83-a5060e006dc4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" 
DevicePath \"\"" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.851937 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" event={"ID":"199ab555-85f7-4168-9e83-a5060e006dc4","Type":"ContainerDied","Data":"e78319b9c6ac363d3f713aaba035794423d78ded0d4d2a89a968d2663bcade91"} Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.852415 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e78319b9c6ac363d3f713aaba035794423d78ded0d4d2a89a968d2663bcade91" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.852012 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.981918 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d"] Oct 01 16:45:21 crc kubenswrapper[4764]: E1001 16:45:21.984072 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199ab555-85f7-4168-9e83-a5060e006dc4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.984147 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="199ab555-85f7-4168-9e83-a5060e006dc4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 16:45:21 crc kubenswrapper[4764]: E1001 16:45:21.984179 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3dd200a-bd47-483f-b11a-e457709ba4f9" containerName="collect-profiles" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.984213 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3dd200a-bd47-483f-b11a-e457709ba4f9" containerName="collect-profiles" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.984858 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="199ab555-85f7-4168-9e83-a5060e006dc4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.984908 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3dd200a-bd47-483f-b11a-e457709ba4f9" containerName="collect-profiles" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.986205 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.994716 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.994813 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.994857 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.995003 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.995616 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 16:45:21 crc kubenswrapper[4764]: I1001 16:45:21.998332 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.001785 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d"] Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.097593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ceph\") 
pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.097648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.097968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.098018 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.098064 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6gb\" (UniqueName: \"kubernetes.io/projected/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-kube-api-access-7v6gb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc 
kubenswrapper[4764]: I1001 16:45:22.098086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.199851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.199893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.199934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.199975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.200006 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6gb\" (UniqueName: \"kubernetes.io/projected/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-kube-api-access-7v6gb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.200026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.204253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.204298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.204421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.204448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.206619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.223820 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6gb\" (UniqueName: \"kubernetes.io/projected/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-kube-api-access-7v6gb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.309114 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.844005 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d"] Oct 01 16:45:22 crc kubenswrapper[4764]: I1001 16:45:22.873682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" event={"ID":"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7","Type":"ContainerStarted","Data":"895ee70283abb7f0dbb30a74587b1e45451a7234d8ce9a11640df61ea74ccfbf"} Oct 01 16:45:25 crc kubenswrapper[4764]: I1001 16:45:25.902273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" event={"ID":"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7","Type":"ContainerStarted","Data":"cfb96b063da9297a253868f8f8598f44be06dac06ac0ce7edfd99b6b341bf383"} Oct 01 16:45:25 crc kubenswrapper[4764]: I1001 16:45:25.932148 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" podStartSLOduration=3.216238459 podStartE2EDuration="4.932095935s" podCreationTimestamp="2025-10-01 16:45:21 +0000 UTC" firstStartedPulling="2025-10-01 16:45:22.852593271 +0000 UTC m=+2585.852240106" lastFinishedPulling="2025-10-01 16:45:24.568450747 +0000 UTC m=+2587.568097582" observedRunningTime="2025-10-01 16:45:25.920234043 +0000 UTC m=+2588.919880888" watchObservedRunningTime="2025-10-01 16:45:25.932095935 +0000 UTC m=+2588.931742820" Oct 01 16:45:34 crc kubenswrapper[4764]: I1001 16:45:34.721979 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:45:34 crc kubenswrapper[4764]: E1001 16:45:34.723246 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:45:47 crc kubenswrapper[4764]: I1001 16:45:47.733264 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:45:47 crc kubenswrapper[4764]: E1001 16:45:47.734412 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:45:48 crc kubenswrapper[4764]: I1001 16:45:48.007250 4764 scope.go:117] "RemoveContainer" containerID="80a68769d3109f5d39ed0639e14ea526a1ac429239bbd2624ac38aff4a7b818c" Oct 01 16:46:00 crc kubenswrapper[4764]: I1001 16:46:00.721930 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:46:00 crc kubenswrapper[4764]: E1001 16:46:00.723133 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:46:12 crc kubenswrapper[4764]: I1001 16:46:12.723643 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:46:12 crc 
kubenswrapper[4764]: E1001 16:46:12.724739 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:46:25 crc kubenswrapper[4764]: I1001 16:46:25.722224 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:46:25 crc kubenswrapper[4764]: E1001 16:46:25.723235 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:46:34 crc kubenswrapper[4764]: I1001 16:46:34.814822 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z97cg"] Oct 01 16:46:34 crc kubenswrapper[4764]: I1001 16:46:34.818897 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:34 crc kubenswrapper[4764]: I1001 16:46:34.835850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z97cg"] Oct 01 16:46:34 crc kubenswrapper[4764]: I1001 16:46:34.924595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-utilities\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:34 crc kubenswrapper[4764]: I1001 16:46:34.924708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jf5f\" (UniqueName: \"kubernetes.io/projected/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-kube-api-access-9jf5f\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:34 crc kubenswrapper[4764]: I1001 16:46:34.924751 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-catalog-content\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.026894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-utilities\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.027003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9jf5f\" (UniqueName: \"kubernetes.io/projected/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-kube-api-access-9jf5f\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.027035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-catalog-content\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.027606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-catalog-content\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.027726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-utilities\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.057405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jf5f\" (UniqueName: \"kubernetes.io/projected/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-kube-api-access-9jf5f\") pod \"redhat-operators-z97cg\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") " pod="openshift-marketplace/redhat-operators-z97cg" Oct 01 16:46:35 crc kubenswrapper[4764]: I1001 16:46:35.158918 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:36 crc kubenswrapper[4764]: I1001 16:46:36.919487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z97cg"]
Oct 01 16:46:37 crc kubenswrapper[4764]: I1001 16:46:37.687488 4764 generic.go:334] "Generic (PLEG): container finished" podID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerID="dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397" exitCode=0
Oct 01 16:46:37 crc kubenswrapper[4764]: I1001 16:46:37.687544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerDied","Data":"dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397"}
Oct 01 16:46:37 crc kubenswrapper[4764]: I1001 16:46:37.687894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerStarted","Data":"127118a3ddf7e76406f3f8a5b7529110bd212dbfb62281f67c5fd4ae8110a6cb"}
Oct 01 16:46:37 crc kubenswrapper[4764]: I1001 16:46:37.690174 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 01 16:46:37 crc kubenswrapper[4764]: I1001 16:46:37.731699 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"
Oct 01 16:46:37 crc kubenswrapper[4764]: E1001 16:46:37.731947 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0"
Oct 01 16:46:39 crc kubenswrapper[4764]: I1001 16:46:39.711974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerStarted","Data":"e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3"}
Oct 01 16:46:40 crc kubenswrapper[4764]: I1001 16:46:40.722213 4764 generic.go:334] "Generic (PLEG): container finished" podID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerID="e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3" exitCode=0
Oct 01 16:46:40 crc kubenswrapper[4764]: I1001 16:46:40.722284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerDied","Data":"e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3"}
Oct 01 16:46:42 crc kubenswrapper[4764]: I1001 16:46:42.740749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerStarted","Data":"dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707"}
Oct 01 16:46:42 crc kubenswrapper[4764]: I1001 16:46:42.765521 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z97cg" podStartSLOduration=4.766944264 podStartE2EDuration="8.765494768s" podCreationTimestamp="2025-10-01 16:46:34 +0000 UTC" firstStartedPulling="2025-10-01 16:46:37.689804679 +0000 UTC m=+2660.689451524" lastFinishedPulling="2025-10-01 16:46:41.688355163 +0000 UTC m=+2664.688002028" observedRunningTime="2025-10-01 16:46:42.756672922 +0000 UTC m=+2665.756319807" watchObservedRunningTime="2025-10-01 16:46:42.765494768 +0000 UTC m=+2665.765141623"
Oct 01 16:46:45 crc kubenswrapper[4764]: I1001 16:46:45.159422 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:45 crc kubenswrapper[4764]: I1001 16:46:45.159708 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:46 crc kubenswrapper[4764]: I1001 16:46:46.212480 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z97cg" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="registry-server" probeResult="failure" output=<
Oct 01 16:46:46 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s
Oct 01 16:46:46 crc kubenswrapper[4764]: >
Oct 01 16:46:52 crc kubenswrapper[4764]: I1001 16:46:52.723372 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"
Oct 01 16:46:52 crc kubenswrapper[4764]: E1001 16:46:52.724601 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0"
Oct 01 16:46:55 crc kubenswrapper[4764]: I1001 16:46:55.208433 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:55 crc kubenswrapper[4764]: I1001 16:46:55.259312 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:55 crc kubenswrapper[4764]: I1001 16:46:55.444795 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z97cg"]
Oct 01 16:46:56 crc kubenswrapper[4764]: I1001 16:46:56.904355 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z97cg" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="registry-server" containerID="cri-o://dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707" gracePeriod=2
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.901394 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.933738 4764 generic.go:334] "Generic (PLEG): container finished" podID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerID="dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707" exitCode=0
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.933798 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerDied","Data":"dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707"}
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.933832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z97cg" event={"ID":"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2","Type":"ContainerDied","Data":"127118a3ddf7e76406f3f8a5b7529110bd212dbfb62281f67c5fd4ae8110a6cb"}
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.933851 4764 scope.go:117] "RemoveContainer" containerID="dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707"
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.934092 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z97cg"
Oct 01 16:46:57 crc kubenswrapper[4764]: I1001 16:46:57.963922 4764 scope.go:117] "RemoveContainer" containerID="e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.005316 4764 scope.go:117] "RemoveContainer" containerID="dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.028334 4764 scope.go:117] "RemoveContainer" containerID="dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707"
Oct 01 16:46:58 crc kubenswrapper[4764]: E1001 16:46:58.028808 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707\": container with ID starting with dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707 not found: ID does not exist" containerID="dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.028856 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707"} err="failed to get container status \"dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707\": rpc error: code = NotFound desc = could not find container \"dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707\": container with ID starting with dfb533cf4fc7b42adc7c2a661e91578799fa81a4a516d99bf5fd00d1f5440707 not found: ID does not exist"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.028884 4764 scope.go:117] "RemoveContainer" containerID="e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3"
Oct 01 16:46:58 crc kubenswrapper[4764]: E1001 16:46:58.029309 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3\": container with ID starting with e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3 not found: ID does not exist" containerID="e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.029362 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3"} err="failed to get container status \"e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3\": rpc error: code = NotFound desc = could not find container \"e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3\": container with ID starting with e0fa02a7eaaaf9f1bc685f0276e19b73f75c47ecfa3d2e617a0a8cd14790cab3 not found: ID does not exist"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.029396 4764 scope.go:117] "RemoveContainer" containerID="dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397"
Oct 01 16:46:58 crc kubenswrapper[4764]: E1001 16:46:58.029780 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397\": container with ID starting with dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397 not found: ID does not exist" containerID="dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.029819 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397"} err="failed to get container status \"dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397\": rpc error: code = NotFound desc = could not find container \"dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397\": container with ID starting with dd8ddab89d84516352e03cdcc315f862e1c4647a212d50a51e8e1995ab708397 not found: ID does not exist"
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.085940 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-catalog-content\") pod \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") "
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.086002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jf5f\" (UniqueName: \"kubernetes.io/projected/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-kube-api-access-9jf5f\") pod \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") "
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.086222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-utilities\") pod \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\" (UID: \"5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2\") "
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.087842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-utilities" (OuterVolumeSpecName: "utilities") pod "5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" (UID: "5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.096813 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-kube-api-access-9jf5f" (OuterVolumeSpecName: "kube-api-access-9jf5f") pod "5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" (UID: "5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2"). InnerVolumeSpecName "kube-api-access-9jf5f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.176490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" (UID: "5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.188798 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.188841 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.188856 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jf5f\" (UniqueName: \"kubernetes.io/projected/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2-kube-api-access-9jf5f\") on node \"crc\" DevicePath \"\""
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.263603 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z97cg"]
Oct 01 16:46:58 crc kubenswrapper[4764]: I1001 16:46:58.271889 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z97cg"]
Oct 01 16:46:59 crc kubenswrapper[4764]: I1001 16:46:59.738530 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" path="/var/lib/kubelet/pods/5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2/volumes"
Oct 01 16:47:03 crc kubenswrapper[4764]: I1001 16:47:03.721964 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"
Oct 01 16:47:03 crc kubenswrapper[4764]: E1001 16:47:03.723169 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0"
Oct 01 16:47:17 crc kubenswrapper[4764]: I1001 16:47:17.730313 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"
Oct 01 16:47:17 crc kubenswrapper[4764]: E1001 16:47:17.731289 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0"
Oct 01 16:47:31 crc kubenswrapper[4764]: I1001 16:47:31.722553 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb"
Oct 01 16:47:32 crc kubenswrapper[4764]: I1001 16:47:32.376322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"9d36306da14f1c645b09c7207dbb3821f0a50fdb34b3cfb5a0ef9a2b606f993d"}
Oct 01 16:49:51 crc kubenswrapper[4764]: I1001 16:49:51.913951 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 16:49:51 crc kubenswrapper[4764]: I1001 16:49:51.914947 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 16:50:06 crc kubenswrapper[4764]: I1001 16:50:06.080220 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" containerID="cfb96b063da9297a253868f8f8598f44be06dac06ac0ce7edfd99b6b341bf383" exitCode=0
Oct 01 16:50:06 crc kubenswrapper[4764]: I1001 16:50:06.080315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" event={"ID":"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7","Type":"ContainerDied","Data":"cfb96b063da9297a253868f8f8598f44be06dac06ac0ce7edfd99b6b341bf383"}
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.605612 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d"
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.702372 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v6gb\" (UniqueName: \"kubernetes.io/projected/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-kube-api-access-7v6gb\") pod \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") "
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.702462 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-combined-ca-bundle\") pod \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") "
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.702609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ssh-key\") pod \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") "
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.702631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-secret-0\") pod \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") "
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.702724 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-inventory\") pod \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") "
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.702800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ceph\") pod \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\" (UID: \"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7\") "
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.708542 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-kube-api-access-7v6gb" (OuterVolumeSpecName: "kube-api-access-7v6gb") pod "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" (UID: "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7"). InnerVolumeSpecName "kube-api-access-7v6gb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.710458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" (UID: "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.713215 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ceph" (OuterVolumeSpecName: "ceph") pod "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" (UID: "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.732177 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-inventory" (OuterVolumeSpecName: "inventory") pod "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" (UID: "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.733090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" (UID: "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.747690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" (UID: "7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.804759 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.804796 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.804811 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.804821 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-ceph\") on node \"crc\" DevicePath \"\""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.804833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v6gb\" (UniqueName: \"kubernetes.io/projected/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-kube-api-access-7v6gb\") on node \"crc\" DevicePath \"\""
Oct 01 16:50:07 crc kubenswrapper[4764]: I1001 16:50:07.804845 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.129476 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d" event={"ID":"7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7","Type":"ContainerDied","Data":"895ee70283abb7f0dbb30a74587b1e45451a7234d8ce9a11640df61ea74ccfbf"}
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.129888 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="895ee70283abb7f0dbb30a74587b1e45451a7234d8ce9a11640df61ea74ccfbf"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.129567 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.250229 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"]
Oct 01 16:50:08 crc kubenswrapper[4764]: E1001 16:50:08.250802 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="extract-utilities"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.250837 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="extract-utilities"
Oct 01 16:50:08 crc kubenswrapper[4764]: E1001 16:50:08.250874 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="extract-content"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.250886 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="extract-content"
Oct 01 16:50:08 crc kubenswrapper[4764]: E1001 16:50:08.250909 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.250922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 01 16:50:08 crc kubenswrapper[4764]: E1001 16:50:08.250944 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="registry-server"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.250954 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="registry-server"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.251261 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.251310 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fcb5baf-a5d0-49d3-a4cf-1b6bd1330ff2" containerName="registry-server"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.252325 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.254039 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.254189 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.256133 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.256289 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.256455 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.257781 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-82br6"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.257904 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.258135 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.258296 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.265388 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"]
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315651 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315684 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315801 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.315964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnvg5\" (UniqueName: \"kubernetes.io/projected/473bdd59-1196-45be-931d-f452ce6bc2fa-kube-api-access-jnvg5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.418935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419011 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419186 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnvg5\" (UniqueName: \"kubernetes.io/projected/473bdd59-1196-45be-931d-f452ce6bc2fa-kube-api-access-jnvg5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.419423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.420068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.421330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.423287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"
Oct 01 16:50:08 crc 
kubenswrapper[4764]: I1001 16:50:08.423557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.423919 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ssh-key\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.425016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.425304 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.426907 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.429131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.430494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.436243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnvg5\" (UniqueName: \"kubernetes.io/projected/473bdd59-1196-45be-931d-f452ce6bc2fa-kube-api-access-jnvg5\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:08 crc kubenswrapper[4764]: I1001 16:50:08.584382 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:50:09 crc kubenswrapper[4764]: I1001 16:50:09.100071 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh"] Oct 01 16:50:09 crc kubenswrapper[4764]: I1001 16:50:09.139257 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" event={"ID":"473bdd59-1196-45be-931d-f452ce6bc2fa","Type":"ContainerStarted","Data":"14175e17f050baae55816d7be30bcf2ef01673f2b6e373fff25cbb03c6332041"} Oct 01 16:50:10 crc kubenswrapper[4764]: I1001 16:50:10.152963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" event={"ID":"473bdd59-1196-45be-931d-f452ce6bc2fa","Type":"ContainerStarted","Data":"8ccb4bf15aee238c259a33a49be02d51f2ef253aeecd5ca144814d24db980cf1"} Oct 01 16:50:10 crc kubenswrapper[4764]: I1001 16:50:10.190687 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" podStartSLOduration=1.703503362 podStartE2EDuration="2.190657616s" podCreationTimestamp="2025-10-01 16:50:08 +0000 UTC" firstStartedPulling="2025-10-01 16:50:09.105106574 +0000 UTC m=+2872.104753419" lastFinishedPulling="2025-10-01 16:50:09.592260828 +0000 UTC m=+2872.591907673" observedRunningTime="2025-10-01 16:50:10.180321842 +0000 UTC m=+2873.179968737" watchObservedRunningTime="2025-10-01 16:50:10.190657616 +0000 UTC m=+2873.190304491" Oct 01 16:50:21 crc kubenswrapper[4764]: I1001 16:50:21.914118 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 
01 16:50:21 crc kubenswrapper[4764]: I1001 16:50:21.915011 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:50:51 crc kubenswrapper[4764]: I1001 16:50:51.913927 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:50:51 crc kubenswrapper[4764]: I1001 16:50:51.914800 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:50:51 crc kubenswrapper[4764]: I1001 16:50:51.914887 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:50:51 crc kubenswrapper[4764]: I1001 16:50:51.916193 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d36306da14f1c645b09c7207dbb3821f0a50fdb34b3cfb5a0ef9a2b606f993d"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:50:51 crc kubenswrapper[4764]: I1001 16:50:51.916303 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" 
podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://9d36306da14f1c645b09c7207dbb3821f0a50fdb34b3cfb5a0ef9a2b606f993d" gracePeriod=600 Oct 01 16:50:52 crc kubenswrapper[4764]: I1001 16:50:52.650741 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="9d36306da14f1c645b09c7207dbb3821f0a50fdb34b3cfb5a0ef9a2b606f993d" exitCode=0 Oct 01 16:50:52 crc kubenswrapper[4764]: I1001 16:50:52.650795 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"9d36306da14f1c645b09c7207dbb3821f0a50fdb34b3cfb5a0ef9a2b606f993d"} Oct 01 16:50:52 crc kubenswrapper[4764]: I1001 16:50:52.651758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a"} Oct 01 16:50:52 crc kubenswrapper[4764]: I1001 16:50:52.651791 4764 scope.go:117] "RemoveContainer" containerID="13e303d5efa9c072debe3718b4b99c3322af6fdbf53f0eca73c9de9ddf036bcb" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.052257 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v9kbx"] Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.055242 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.077438 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9kbx"] Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.091276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-catalog-content\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.091445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-utilities\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.091497 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ksbl\" (UniqueName: \"kubernetes.io/projected/60d2084b-f53c-402a-8c9e-03487f9ee23f-kube-api-access-9ksbl\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.193203 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-utilities\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.193266 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9ksbl\" (UniqueName: \"kubernetes.io/projected/60d2084b-f53c-402a-8c9e-03487f9ee23f-kube-api-access-9ksbl\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.193287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-catalog-content\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.194384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-catalog-content\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.194793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-utilities\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.214534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ksbl\" (UniqueName: \"kubernetes.io/projected/60d2084b-f53c-402a-8c9e-03487f9ee23f-kube-api-access-9ksbl\") pod \"community-operators-v9kbx\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.393820 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:50:54 crc kubenswrapper[4764]: W1001 16:50:54.950397 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d2084b_f53c_402a_8c9e_03487f9ee23f.slice/crio-c3f89375dfd21fd66a2f2fefd1ba9aab20056f81351607af0c9883392cf05ec9 WatchSource:0}: Error finding container c3f89375dfd21fd66a2f2fefd1ba9aab20056f81351607af0c9883392cf05ec9: Status 404 returned error can't find the container with id c3f89375dfd21fd66a2f2fefd1ba9aab20056f81351607af0c9883392cf05ec9 Oct 01 16:50:54 crc kubenswrapper[4764]: I1001 16:50:54.953501 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v9kbx"] Oct 01 16:50:55 crc kubenswrapper[4764]: I1001 16:50:55.690093 4764 generic.go:334] "Generic (PLEG): container finished" podID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerID="4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac" exitCode=0 Oct 01 16:50:55 crc kubenswrapper[4764]: I1001 16:50:55.690222 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9kbx" event={"ID":"60d2084b-f53c-402a-8c9e-03487f9ee23f","Type":"ContainerDied","Data":"4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac"} Oct 01 16:50:55 crc kubenswrapper[4764]: I1001 16:50:55.690426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9kbx" event={"ID":"60d2084b-f53c-402a-8c9e-03487f9ee23f","Type":"ContainerStarted","Data":"c3f89375dfd21fd66a2f2fefd1ba9aab20056f81351607af0c9883392cf05ec9"} Oct 01 16:50:57 crc kubenswrapper[4764]: I1001 16:50:57.713552 4764 generic.go:334] "Generic (PLEG): container finished" podID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerID="fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52" exitCode=0 Oct 01 16:50:57 crc kubenswrapper[4764]: I1001 
16:50:57.713628 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9kbx" event={"ID":"60d2084b-f53c-402a-8c9e-03487f9ee23f","Type":"ContainerDied","Data":"fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52"} Oct 01 16:50:58 crc kubenswrapper[4764]: I1001 16:50:58.728194 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9kbx" event={"ID":"60d2084b-f53c-402a-8c9e-03487f9ee23f","Type":"ContainerStarted","Data":"b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a"} Oct 01 16:50:58 crc kubenswrapper[4764]: I1001 16:50:58.754589 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v9kbx" podStartSLOduration=2.186692526 podStartE2EDuration="4.754570234s" podCreationTimestamp="2025-10-01 16:50:54 +0000 UTC" firstStartedPulling="2025-10-01 16:50:55.692704654 +0000 UTC m=+2918.692351539" lastFinishedPulling="2025-10-01 16:50:58.260582372 +0000 UTC m=+2921.260229247" observedRunningTime="2025-10-01 16:50:58.748644838 +0000 UTC m=+2921.748291673" watchObservedRunningTime="2025-10-01 16:50:58.754570234 +0000 UTC m=+2921.754217069" Oct 01 16:51:04 crc kubenswrapper[4764]: I1001 16:51:04.394492 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:51:04 crc kubenswrapper[4764]: I1001 16:51:04.395315 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:51:04 crc kubenswrapper[4764]: I1001 16:51:04.477522 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:51:04 crc kubenswrapper[4764]: I1001 16:51:04.842451 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v9kbx" Oct 
01 16:51:04 crc kubenswrapper[4764]: I1001 16:51:04.900890 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v9kbx"] Oct 01 16:51:06 crc kubenswrapper[4764]: I1001 16:51:06.800305 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v9kbx" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="registry-server" containerID="cri-o://b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a" gracePeriod=2 Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.306154 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.474366 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-utilities\") pod \"60d2084b-f53c-402a-8c9e-03487f9ee23f\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.474650 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ksbl\" (UniqueName: \"kubernetes.io/projected/60d2084b-f53c-402a-8c9e-03487f9ee23f-kube-api-access-9ksbl\") pod \"60d2084b-f53c-402a-8c9e-03487f9ee23f\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.474696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-catalog-content\") pod \"60d2084b-f53c-402a-8c9e-03487f9ee23f\" (UID: \"60d2084b-f53c-402a-8c9e-03487f9ee23f\") " Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.475205 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-utilities" (OuterVolumeSpecName: "utilities") pod "60d2084b-f53c-402a-8c9e-03487f9ee23f" (UID: "60d2084b-f53c-402a-8c9e-03487f9ee23f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.481544 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60d2084b-f53c-402a-8c9e-03487f9ee23f-kube-api-access-9ksbl" (OuterVolumeSpecName: "kube-api-access-9ksbl") pod "60d2084b-f53c-402a-8c9e-03487f9ee23f" (UID: "60d2084b-f53c-402a-8c9e-03487f9ee23f"). InnerVolumeSpecName "kube-api-access-9ksbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.576524 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ksbl\" (UniqueName: \"kubernetes.io/projected/60d2084b-f53c-402a-8c9e-03487f9ee23f-kube-api-access-9ksbl\") on node \"crc\" DevicePath \"\"" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.576561 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.755265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60d2084b-f53c-402a-8c9e-03487f9ee23f" (UID: "60d2084b-f53c-402a-8c9e-03487f9ee23f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.781024 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60d2084b-f53c-402a-8c9e-03487f9ee23f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.814545 4764 generic.go:334] "Generic (PLEG): container finished" podID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerID="b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a" exitCode=0 Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.814622 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v9kbx" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.814647 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9kbx" event={"ID":"60d2084b-f53c-402a-8c9e-03487f9ee23f","Type":"ContainerDied","Data":"b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a"} Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.814783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v9kbx" event={"ID":"60d2084b-f53c-402a-8c9e-03487f9ee23f","Type":"ContainerDied","Data":"c3f89375dfd21fd66a2f2fefd1ba9aab20056f81351607af0c9883392cf05ec9"} Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.814830 4764 scope.go:117] "RemoveContainer" containerID="b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.859470 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v9kbx"] Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.863729 4764 scope.go:117] "RemoveContainer" containerID="fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52" Oct 01 16:51:07 crc kubenswrapper[4764]: 
I1001 16:51:07.868411 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v9kbx"] Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.888282 4764 scope.go:117] "RemoveContainer" containerID="4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.929644 4764 scope.go:117] "RemoveContainer" containerID="b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a" Oct 01 16:51:07 crc kubenswrapper[4764]: E1001 16:51:07.930089 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a\": container with ID starting with b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a not found: ID does not exist" containerID="b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.930127 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a"} err="failed to get container status \"b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a\": rpc error: code = NotFound desc = could not find container \"b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a\": container with ID starting with b2ffeb9d98d579475c659bba26dbc12b0e2755894f5c6c1fb705d13c85585b8a not found: ID does not exist" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.930153 4764 scope.go:117] "RemoveContainer" containerID="fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52" Oct 01 16:51:07 crc kubenswrapper[4764]: E1001 16:51:07.930435 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52\": container 
with ID starting with fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52 not found: ID does not exist" containerID="fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.930458 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52"} err="failed to get container status \"fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52\": rpc error: code = NotFound desc = could not find container \"fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52\": container with ID starting with fc5d54684c90481f04ac9616b00713422da443ff152e7eb0409d2b58f0ef9a52 not found: ID does not exist" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.930472 4764 scope.go:117] "RemoveContainer" containerID="4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac" Oct 01 16:51:07 crc kubenswrapper[4764]: E1001 16:51:07.930699 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac\": container with ID starting with 4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac not found: ID does not exist" containerID="4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac" Oct 01 16:51:07 crc kubenswrapper[4764]: I1001 16:51:07.930740 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac"} err="failed to get container status \"4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac\": rpc error: code = NotFound desc = could not find container \"4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac\": container with ID starting with 4b27248404400247d1d8e9838079269dad5ee3c78a9e270705c43451fddad8ac not 
found: ID does not exist" Oct 01 16:51:09 crc kubenswrapper[4764]: I1001 16:51:09.738519 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" path="/var/lib/kubelet/pods/60d2084b-f53c-402a-8c9e-03487f9ee23f/volumes" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.050447 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zxrtg"] Oct 01 16:51:45 crc kubenswrapper[4764]: E1001 16:51:45.051904 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="extract-utilities" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.051930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="extract-utilities" Oct 01 16:51:45 crc kubenswrapper[4764]: E1001 16:51:45.052001 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="registry-server" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.052019 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="registry-server" Oct 01 16:51:45 crc kubenswrapper[4764]: E1001 16:51:45.052091 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="extract-content" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.052110 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="extract-content" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.052537 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60d2084b-f53c-402a-8c9e-03487f9ee23f" containerName="registry-server" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.055425 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.069494 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zxrtg"] Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.153553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-utilities\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.153630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcsmg\" (UniqueName: \"kubernetes.io/projected/c881e455-e4f2-44b9-93a9-09549a617352-kube-api-access-jcsmg\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.153697 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-catalog-content\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.255834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-utilities\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.255916 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jcsmg\" (UniqueName: \"kubernetes.io/projected/c881e455-e4f2-44b9-93a9-09549a617352-kube-api-access-jcsmg\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.255988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-catalog-content\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.256605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-catalog-content\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.256685 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-utilities\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.277737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcsmg\" (UniqueName: \"kubernetes.io/projected/c881e455-e4f2-44b9-93a9-09549a617352-kube-api-access-jcsmg\") pod \"certified-operators-zxrtg\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.391298 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:45 crc kubenswrapper[4764]: I1001 16:51:45.877338 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zxrtg"] Oct 01 16:51:46 crc kubenswrapper[4764]: I1001 16:51:46.264558 4764 generic.go:334] "Generic (PLEG): container finished" podID="c881e455-e4f2-44b9-93a9-09549a617352" containerID="cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6" exitCode=0 Oct 01 16:51:46 crc kubenswrapper[4764]: I1001 16:51:46.264608 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerDied","Data":"cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6"} Oct 01 16:51:46 crc kubenswrapper[4764]: I1001 16:51:46.264639 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerStarted","Data":"1619964a184825e06863cb9f0e435b2373c1fe2a8d8734d09f0ad49ac95b67bc"} Oct 01 16:51:46 crc kubenswrapper[4764]: I1001 16:51:46.266693 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.291354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerStarted","Data":"8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b"} Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.829463 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kfg24"] Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.832612 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.848748 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kfg24"] Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.925256 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-utilities\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.925427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwrdt\" (UniqueName: \"kubernetes.io/projected/1cad1ba8-809c-4994-b19e-aadfab0c504d-kube-api-access-cwrdt\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:48 crc kubenswrapper[4764]: I1001 16:51:48.925449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-catalog-content\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.026713 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwrdt\" (UniqueName: \"kubernetes.io/projected/1cad1ba8-809c-4994-b19e-aadfab0c504d-kube-api-access-cwrdt\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.026756 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-catalog-content\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.026822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-utilities\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.027341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-utilities\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.027590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-catalog-content\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.046980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwrdt\" (UniqueName: \"kubernetes.io/projected/1cad1ba8-809c-4994-b19e-aadfab0c504d-kube-api-access-cwrdt\") pod \"redhat-marketplace-kfg24\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.153295 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.312170 4764 generic.go:334] "Generic (PLEG): container finished" podID="c881e455-e4f2-44b9-93a9-09549a617352" containerID="8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b" exitCode=0 Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.312345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerDied","Data":"8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b"} Oct 01 16:51:49 crc kubenswrapper[4764]: W1001 16:51:49.694562 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cad1ba8_809c_4994_b19e_aadfab0c504d.slice/crio-e5cd20116bba0c29e20d202d197f6417d9b3541911b3f3547a47ed11b793d873 WatchSource:0}: Error finding container e5cd20116bba0c29e20d202d197f6417d9b3541911b3f3547a47ed11b793d873: Status 404 returned error can't find the container with id e5cd20116bba0c29e20d202d197f6417d9b3541911b3f3547a47ed11b793d873 Oct 01 16:51:49 crc kubenswrapper[4764]: I1001 16:51:49.708616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kfg24"] Oct 01 16:51:50 crc kubenswrapper[4764]: I1001 16:51:50.325405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerStarted","Data":"6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05"} Oct 01 16:51:50 crc kubenswrapper[4764]: I1001 16:51:50.329133 4764 generic.go:334] "Generic (PLEG): container finished" podID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerID="489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb" exitCode=0 Oct 01 16:51:50 crc kubenswrapper[4764]: I1001 
16:51:50.329166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kfg24" event={"ID":"1cad1ba8-809c-4994-b19e-aadfab0c504d","Type":"ContainerDied","Data":"489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb"} Oct 01 16:51:50 crc kubenswrapper[4764]: I1001 16:51:50.329196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kfg24" event={"ID":"1cad1ba8-809c-4994-b19e-aadfab0c504d","Type":"ContainerStarted","Data":"e5cd20116bba0c29e20d202d197f6417d9b3541911b3f3547a47ed11b793d873"} Oct 01 16:51:50 crc kubenswrapper[4764]: I1001 16:51:50.348522 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zxrtg" podStartSLOduration=1.748181093 podStartE2EDuration="5.348506908s" podCreationTimestamp="2025-10-01 16:51:45 +0000 UTC" firstStartedPulling="2025-10-01 16:51:46.266392781 +0000 UTC m=+2969.266039626" lastFinishedPulling="2025-10-01 16:51:49.866718606 +0000 UTC m=+2972.866365441" observedRunningTime="2025-10-01 16:51:50.346993641 +0000 UTC m=+2973.346640476" watchObservedRunningTime="2025-10-01 16:51:50.348506908 +0000 UTC m=+2973.348153743" Oct 01 16:51:52 crc kubenswrapper[4764]: I1001 16:51:52.355752 4764 generic.go:334] "Generic (PLEG): container finished" podID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerID="be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84" exitCode=0 Oct 01 16:51:52 crc kubenswrapper[4764]: I1001 16:51:52.355808 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kfg24" event={"ID":"1cad1ba8-809c-4994-b19e-aadfab0c504d","Type":"ContainerDied","Data":"be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84"} Oct 01 16:51:53 crc kubenswrapper[4764]: I1001 16:51:53.371108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kfg24" 
event={"ID":"1cad1ba8-809c-4994-b19e-aadfab0c504d","Type":"ContainerStarted","Data":"3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca"} Oct 01 16:51:53 crc kubenswrapper[4764]: I1001 16:51:53.394936 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kfg24" podStartSLOduration=2.898534867 podStartE2EDuration="5.394917518s" podCreationTimestamp="2025-10-01 16:51:48 +0000 UTC" firstStartedPulling="2025-10-01 16:51:50.331689434 +0000 UTC m=+2973.331336309" lastFinishedPulling="2025-10-01 16:51:52.828072115 +0000 UTC m=+2975.827718960" observedRunningTime="2025-10-01 16:51:53.394030796 +0000 UTC m=+2976.393677631" watchObservedRunningTime="2025-10-01 16:51:53.394917518 +0000 UTC m=+2976.394564363" Oct 01 16:51:55 crc kubenswrapper[4764]: I1001 16:51:55.393502 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:55 crc kubenswrapper[4764]: I1001 16:51:55.393847 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:55 crc kubenswrapper[4764]: I1001 16:51:55.445590 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:56 crc kubenswrapper[4764]: I1001 16:51:56.457912 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:57 crc kubenswrapper[4764]: I1001 16:51:57.628696 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zxrtg"] Oct 01 16:51:58 crc kubenswrapper[4764]: I1001 16:51:58.422027 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zxrtg" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="registry-server" 
containerID="cri-o://6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05" gracePeriod=2 Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.154123 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.154924 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.217588 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.359817 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.432372 4764 generic.go:334] "Generic (PLEG): container finished" podID="c881e455-e4f2-44b9-93a9-09549a617352" containerID="6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05" exitCode=0 Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.432470 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerDied","Data":"6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05"} Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.432516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zxrtg" event={"ID":"c881e455-e4f2-44b9-93a9-09549a617352","Type":"ContainerDied","Data":"1619964a184825e06863cb9f0e435b2373c1fe2a8d8734d09f0ad49ac95b67bc"} Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.432537 4764 scope.go:117] "RemoveContainer" containerID="6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 
16:51:59.432822 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zxrtg" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.450093 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcsmg\" (UniqueName: \"kubernetes.io/projected/c881e455-e4f2-44b9-93a9-09549a617352-kube-api-access-jcsmg\") pod \"c881e455-e4f2-44b9-93a9-09549a617352\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.450358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-utilities\") pod \"c881e455-e4f2-44b9-93a9-09549a617352\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.450538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-catalog-content\") pod \"c881e455-e4f2-44b9-93a9-09549a617352\" (UID: \"c881e455-e4f2-44b9-93a9-09549a617352\") " Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.451414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-utilities" (OuterVolumeSpecName: "utilities") pod "c881e455-e4f2-44b9-93a9-09549a617352" (UID: "c881e455-e4f2-44b9-93a9-09549a617352"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.454258 4764 scope.go:117] "RemoveContainer" containerID="8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.456280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c881e455-e4f2-44b9-93a9-09549a617352-kube-api-access-jcsmg" (OuterVolumeSpecName: "kube-api-access-jcsmg") pod "c881e455-e4f2-44b9-93a9-09549a617352" (UID: "c881e455-e4f2-44b9-93a9-09549a617352"). InnerVolumeSpecName "kube-api-access-jcsmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.483959 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.506010 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c881e455-e4f2-44b9-93a9-09549a617352" (UID: "c881e455-e4f2-44b9-93a9-09549a617352"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.528028 4764 scope.go:117] "RemoveContainer" containerID="cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.552473 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcsmg\" (UniqueName: \"kubernetes.io/projected/c881e455-e4f2-44b9-93a9-09549a617352-kube-api-access-jcsmg\") on node \"crc\" DevicePath \"\"" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.552503 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.552512 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c881e455-e4f2-44b9-93a9-09549a617352-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.563619 4764 scope.go:117] "RemoveContainer" containerID="6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05" Oct 01 16:51:59 crc kubenswrapper[4764]: E1001 16:51:59.564788 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05\": container with ID starting with 6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05 not found: ID does not exist" containerID="6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.564952 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05"} err="failed to get container status 
\"6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05\": rpc error: code = NotFound desc = could not find container \"6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05\": container with ID starting with 6d095ee29d28b92450724b3b4513c18b98fd3442eb4589d8b6f2e7adf8bb7b05 not found: ID does not exist" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.565126 4764 scope.go:117] "RemoveContainer" containerID="8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b" Oct 01 16:51:59 crc kubenswrapper[4764]: E1001 16:51:59.565685 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b\": container with ID starting with 8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b not found: ID does not exist" containerID="8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.565721 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b"} err="failed to get container status \"8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b\": rpc error: code = NotFound desc = could not find container \"8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b\": container with ID starting with 8a6c413f6160d37fd9fc7c6d8e838e930d585c059c231d76dd705fd5a3a0e93b not found: ID does not exist" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.565748 4764 scope.go:117] "RemoveContainer" containerID="cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6" Oct 01 16:51:59 crc kubenswrapper[4764]: E1001 16:51:59.566110 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6\": container with ID starting with cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6 not found: ID does not exist" containerID="cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.566136 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6"} err="failed to get container status \"cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6\": rpc error: code = NotFound desc = could not find container \"cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6\": container with ID starting with cb55819296df8c0e569bc4b3476a5264b0562109ee9b723fb8a026076a3396c6 not found: ID does not exist" Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.770621 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zxrtg"] Oct 01 16:51:59 crc kubenswrapper[4764]: I1001 16:51:59.778917 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zxrtg"] Oct 01 16:52:01 crc kubenswrapper[4764]: I1001 16:52:01.624392 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kfg24"] Oct 01 16:52:01 crc kubenswrapper[4764]: I1001 16:52:01.733706 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c881e455-e4f2-44b9-93a9-09549a617352" path="/var/lib/kubelet/pods/c881e455-e4f2-44b9-93a9-09549a617352/volumes" Oct 01 16:52:02 crc kubenswrapper[4764]: I1001 16:52:02.463510 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kfg24" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="registry-server" containerID="cri-o://3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca" 
gracePeriod=2 Oct 01 16:52:02 crc kubenswrapper[4764]: I1001 16:52:02.955478 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.024923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwrdt\" (UniqueName: \"kubernetes.io/projected/1cad1ba8-809c-4994-b19e-aadfab0c504d-kube-api-access-cwrdt\") pod \"1cad1ba8-809c-4994-b19e-aadfab0c504d\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.025113 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-utilities\") pod \"1cad1ba8-809c-4994-b19e-aadfab0c504d\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.025142 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-catalog-content\") pod \"1cad1ba8-809c-4994-b19e-aadfab0c504d\" (UID: \"1cad1ba8-809c-4994-b19e-aadfab0c504d\") " Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.025993 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-utilities" (OuterVolumeSpecName: "utilities") pod "1cad1ba8-809c-4994-b19e-aadfab0c504d" (UID: "1cad1ba8-809c-4994-b19e-aadfab0c504d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.030241 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cad1ba8-809c-4994-b19e-aadfab0c504d-kube-api-access-cwrdt" (OuterVolumeSpecName: "kube-api-access-cwrdt") pod "1cad1ba8-809c-4994-b19e-aadfab0c504d" (UID: "1cad1ba8-809c-4994-b19e-aadfab0c504d"). InnerVolumeSpecName "kube-api-access-cwrdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.037812 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cad1ba8-809c-4994-b19e-aadfab0c504d" (UID: "1cad1ba8-809c-4994-b19e-aadfab0c504d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.127451 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwrdt\" (UniqueName: \"kubernetes.io/projected/1cad1ba8-809c-4994-b19e-aadfab0c504d-kube-api-access-cwrdt\") on node \"crc\" DevicePath \"\"" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.127484 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.127496 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cad1ba8-809c-4994-b19e-aadfab0c504d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.476353 4764 generic.go:334] "Generic (PLEG): container finished" podID="1cad1ba8-809c-4994-b19e-aadfab0c504d" 
containerID="3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca" exitCode=0 Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.476453 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kfg24" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.476458 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kfg24" event={"ID":"1cad1ba8-809c-4994-b19e-aadfab0c504d","Type":"ContainerDied","Data":"3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca"} Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.476846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kfg24" event={"ID":"1cad1ba8-809c-4994-b19e-aadfab0c504d","Type":"ContainerDied","Data":"e5cd20116bba0c29e20d202d197f6417d9b3541911b3f3547a47ed11b793d873"} Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.476866 4764 scope.go:117] "RemoveContainer" containerID="3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.499253 4764 scope.go:117] "RemoveContainer" containerID="be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.544024 4764 scope.go:117] "RemoveContainer" containerID="489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.544207 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kfg24"] Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.557981 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kfg24"] Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.578188 4764 scope.go:117] "RemoveContainer" containerID="3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca" Oct 01 
16:52:03 crc kubenswrapper[4764]: E1001 16:52:03.578648 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca\": container with ID starting with 3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca not found: ID does not exist" containerID="3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.578703 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca"} err="failed to get container status \"3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca\": rpc error: code = NotFound desc = could not find container \"3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca\": container with ID starting with 3d800e84ab03e69d834ce39403acb4f02380e030e45c8f66b229300f0e6db1ca not found: ID does not exist" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.578741 4764 scope.go:117] "RemoveContainer" containerID="be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84" Oct 01 16:52:03 crc kubenswrapper[4764]: E1001 16:52:03.579111 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84\": container with ID starting with be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84 not found: ID does not exist" containerID="be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.579154 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84"} err="failed to get container status 
\"be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84\": rpc error: code = NotFound desc = could not find container \"be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84\": container with ID starting with be32fe77640ae0aa50e70863544f01126682407124671e7f4689fbb48ff00a84 not found: ID does not exist" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.579182 4764 scope.go:117] "RemoveContainer" containerID="489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb" Oct 01 16:52:03 crc kubenswrapper[4764]: E1001 16:52:03.579528 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb\": container with ID starting with 489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb not found: ID does not exist" containerID="489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.579558 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb"} err="failed to get container status \"489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb\": rpc error: code = NotFound desc = could not find container \"489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb\": container with ID starting with 489919e5fb3e93795acb8fae3c10383fc9d560e9666ee63a2a81a5645899cedb not found: ID does not exist" Oct 01 16:52:03 crc kubenswrapper[4764]: I1001 16:52:03.734609 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" path="/var/lib/kubelet/pods/1cad1ba8-809c-4994-b19e-aadfab0c504d/volumes" Oct 01 16:53:21 crc kubenswrapper[4764]: I1001 16:53:21.914289 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:53:21 crc kubenswrapper[4764]: I1001 16:53:21.915498 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:53:51 crc kubenswrapper[4764]: I1001 16:53:51.914079 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:53:51 crc kubenswrapper[4764]: I1001 16:53:51.915167 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:54:11 crc kubenswrapper[4764]: I1001 16:54:11.898854 4764 generic.go:334] "Generic (PLEG): container finished" podID="473bdd59-1196-45be-931d-f452ce6bc2fa" containerID="8ccb4bf15aee238c259a33a49be02d51f2ef253aeecd5ca144814d24db980cf1" exitCode=0 Oct 01 16:54:11 crc kubenswrapper[4764]: I1001 16:54:11.899566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" event={"ID":"473bdd59-1196-45be-931d-f452ce6bc2fa","Type":"ContainerDied","Data":"8ccb4bf15aee238c259a33a49be02d51f2ef253aeecd5ca144814d24db980cf1"} Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.424314 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.444682 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-inventory\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.445796 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.445839 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-1\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.445966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph-nova-0\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.446000 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-custom-ceph-combined-ca-bundle\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 
16:54:13.446037 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-1\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.446107 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnvg5\" (UniqueName: \"kubernetes.io/projected/473bdd59-1196-45be-931d-f452ce6bc2fa-kube-api-access-jnvg5\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.446151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-extra-config-0\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.446185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ssh-key\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.446234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-0\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.446312 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-0\") pod \"473bdd59-1196-45be-931d-f452ce6bc2fa\" (UID: \"473bdd59-1196-45be-931d-f452ce6bc2fa\") " Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.453313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.453476 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph" (OuterVolumeSpecName: "ceph") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.482434 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473bdd59-1196-45be-931d-f452ce6bc2fa-kube-api-access-jnvg5" (OuterVolumeSpecName: "kube-api-access-jnvg5") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "kube-api-access-jnvg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.502535 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.503159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-inventory" (OuterVolumeSpecName: "inventory") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.505723 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.522065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.525023 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.525805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.528894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.532248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "473bdd59-1196-45be-931d-f452ce6bc2fa" (UID: "473bdd59-1196-45be-931d-f452ce6bc2fa"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548381 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548509 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548524 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548538 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548551 4764 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548564 4764 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548576 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 
crc kubenswrapper[4764]: I1001 16:54:13.548588 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnvg5\" (UniqueName: \"kubernetes.io/projected/473bdd59-1196-45be-931d-f452ce6bc2fa-kube-api-access-jnvg5\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548600 4764 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548610 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.548621 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/473bdd59-1196-45be-931d-f452ce6bc2fa-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.927658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" event={"ID":"473bdd59-1196-45be-931d-f452ce6bc2fa","Type":"ContainerDied","Data":"14175e17f050baae55816d7be30bcf2ef01673f2b6e373fff25cbb03c6332041"} Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.927702 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14175e17f050baae55816d7be30bcf2ef01673f2b6e373fff25cbb03c6332041" Oct 01 16:54:13 crc kubenswrapper[4764]: I1001 16:54:13.927755 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh" Oct 01 16:54:21 crc kubenswrapper[4764]: I1001 16:54:21.914267 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:54:21 crc kubenswrapper[4764]: I1001 16:54:21.914988 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:54:21 crc kubenswrapper[4764]: I1001 16:54:21.915034 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 16:54:21 crc kubenswrapper[4764]: I1001 16:54:21.915895 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:54:21 crc kubenswrapper[4764]: I1001 16:54:21.915954 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" gracePeriod=600 Oct 01 16:54:22 crc kubenswrapper[4764]: E1001 16:54:22.042761 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:54:23 crc kubenswrapper[4764]: I1001 16:54:23.021489 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" exitCode=0 Oct 01 16:54:23 crc kubenswrapper[4764]: I1001 16:54:23.021585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a"} Oct 01 16:54:23 crc kubenswrapper[4764]: I1001 16:54:23.022007 4764 scope.go:117] "RemoveContainer" containerID="9d36306da14f1c645b09c7207dbb3821f0a50fdb34b3cfb5a0ef9a2b606f993d" Oct 01 16:54:23 crc kubenswrapper[4764]: I1001 16:54:23.023000 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:54:23 crc kubenswrapper[4764]: E1001 16:54:23.023342 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.177410 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 01 16:54:28 crc 
kubenswrapper[4764]: E1001 16:54:28.178307 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="extract-content" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178320 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="extract-content" Oct 01 16:54:28 crc kubenswrapper[4764]: E1001 16:54:28.178341 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="extract-utilities" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178347 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="extract-utilities" Oct 01 16:54:28 crc kubenswrapper[4764]: E1001 16:54:28.178356 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="registry-server" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178362 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="registry-server" Oct 01 16:54:28 crc kubenswrapper[4764]: E1001 16:54:28.178373 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="registry-server" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178379 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="registry-server" Oct 01 16:54:28 crc kubenswrapper[4764]: E1001 16:54:28.178388 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473bdd59-1196-45be-931d-f452ce6bc2fa" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178394 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="473bdd59-1196-45be-931d-f452ce6bc2fa" 
containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 01 16:54:28 crc kubenswrapper[4764]: E1001 16:54:28.178410 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="extract-utilities" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178416 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="extract-utilities" Oct 01 16:54:28 crc kubenswrapper[4764]: E1001 16:54:28.178426 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="extract-content" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178431 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="extract-content" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178598 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="473bdd59-1196-45be-931d-f452ce6bc2fa" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178613 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cad1ba8-809c-4994-b19e-aadfab0c504d" containerName="registry-server" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.178625 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c881e455-e4f2-44b9-93a9-09549a617352" containerName="registry-server" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.183083 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.191652 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.192030 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.198643 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.214761 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.216397 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.218625 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237810 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-dev\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-ceph\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237874 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-locks-brick\") pod \"cinder-backup-0\" (UID: 
\"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-sys\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " 
pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.237997 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4b10553-8ea5-49bf-96cf-c22620f1ced3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238032 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97j6f\" (UniqueName: \"kubernetes.io/projected/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-kube-api-access-97j6f\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc 
kubenswrapper[4764]: I1001 16:54:28.238134 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-config-data\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238149 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238166 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-run\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238200 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-scripts\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238215 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-lib-modules\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-run\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238283 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238324 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw85m\" (UniqueName: 
\"kubernetes.io/projected/f4b10553-8ea5-49bf-96cf-c22620f1ced3-kube-api-access-mw85m\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238355 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.238372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.249570 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-run\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340275 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-scripts\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-lib-modules\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-run\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-run\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-run\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-lib-modules\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340475 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" 
(UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw85m\" (UniqueName: \"kubernetes.io/projected/f4b10553-8ea5-49bf-96cf-c22620f1ced3-kube-api-access-mw85m\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340698 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340744 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340819 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-dev\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc 
kubenswrapper[4764]: I1001 16:54:28.340965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.340985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-ceph\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341030 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-sys\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: 
\"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4b10553-8ea5-49bf-96cf-c22620f1ced3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341289 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97j6f\" (UniqueName: \"kubernetes.io/projected/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-kube-api-access-97j6f\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341305 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341353 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-config-data\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341378 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341475 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341727 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-sys\") pod \"cinder-backup-0\" (UID: 
\"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341819 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.341858 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-dev\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.343123 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.343191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.349751 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.350076 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f4b10553-8ea5-49bf-96cf-c22620f1ced3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.350333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.351339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.351990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.353436 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " 
pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.353873 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-scripts\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.356568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4b10553-8ea5-49bf-96cf-c22620f1ced3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.358302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4b10553-8ea5-49bf-96cf-c22620f1ced3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.360508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-ceph\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.361193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw85m\" (UniqueName: \"kubernetes.io/projected/f4b10553-8ea5-49bf-96cf-c22620f1ced3-kube-api-access-mw85m\") pod \"cinder-volume-volume1-0\" (UID: \"f4b10553-8ea5-49bf-96cf-c22620f1ced3\") " pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.362704 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97j6f\" (UniqueName: 
\"kubernetes.io/projected/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-kube-api-access-97j6f\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.365324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.366688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b47ff15-96b3-49ac-a5e5-1ce1051d53a0-config-data\") pod \"cinder-backup-0\" (UID: \"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0\") " pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.537670 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.551097 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.755635 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-xzsgk"] Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.757256 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.766971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-xzsgk"] Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.864107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hd8\" (UniqueName: \"kubernetes.io/projected/372fcf2a-4cac-4506-b875-aada21327f29-kube-api-access-d9hd8\") pod \"manila-db-create-xzsgk\" (UID: \"372fcf2a-4cac-4506-b875-aada21327f29\") " pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.899154 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-65467fc489-ql7ls"] Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.912381 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.914167 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-cgdh5" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.917613 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.917727 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.917902 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.967985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-config-data\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " 
pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.968243 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-scripts\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.968325 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ac18b-b8b2-4ec2-8c8f-40484023c08d-logs\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.968474 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hd8\" (UniqueName: \"kubernetes.io/projected/372fcf2a-4cac-4506-b875-aada21327f29-kube-api-access-d9hd8\") pod \"manila-db-create-xzsgk\" (UID: \"372fcf2a-4cac-4506-b875-aada21327f29\") " pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.968558 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkdgc\" (UniqueName: \"kubernetes.io/projected/310ac18b-b8b2-4ec2-8c8f-40484023c08d-kube-api-access-fkdgc\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.968661 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/310ac18b-b8b2-4ec2-8c8f-40484023c08d-horizon-secret-key\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " 
pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:28 crc kubenswrapper[4764]: I1001 16:54:28.988211 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65467fc489-ql7ls"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:28.999124 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.005956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.013869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hd8\" (UniqueName: \"kubernetes.io/projected/372fcf2a-4cac-4506-b875-aada21327f29-kube-api-access-d9hd8\") pod \"manila-db-create-xzsgk\" (UID: \"372fcf2a-4cac-4506-b875-aada21327f29\") " pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.014782 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-s7qgl" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.014888 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.015133 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.022442 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.035916 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.053084 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 
16:54:29.054504 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.060366 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.060528 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.063651 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5dd67b586f-m6wgn"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.067300 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.070895 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071102 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-config-data\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-scripts\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ac18b-b8b2-4ec2-8c8f-40484023c08d-logs\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071257 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071327 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkdgc\" (UniqueName: \"kubernetes.io/projected/310ac18b-b8b2-4ec2-8c8f-40484023c08d-kube-api-access-fkdgc\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-logs\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/310ac18b-b8b2-4ec2-8c8f-40484023c08d-horizon-secret-key\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-ceph\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071600 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.071648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdncz\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-kube-api-access-pdncz\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.072396 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-config-data\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.072443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-scripts\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.072724 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ac18b-b8b2-4ec2-8c8f-40484023c08d-logs\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.081250 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: E1001 16:54:29.082129 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="unmounted volumes=[ceph combined-ca-bundle config-data glance httpd-run kube-api-access-pdncz logs public-tls-certs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="f4f1ce31-53e8-439c-904b-aff4527ab132" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.086662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/310ac18b-b8b2-4ec2-8c8f-40484023c08d-horizon-secret-key\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.090605 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dd67b586f-m6wgn"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.090733 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.093492 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkdgc\" (UniqueName: \"kubernetes.io/projected/310ac18b-b8b2-4ec2-8c8f-40484023c08d-kube-api-access-fkdgc\") pod \"horizon-65467fc489-ql7ls\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.110887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.124944 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.164368 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174154 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-logs\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174380 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-ceph\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174532 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.174784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-logs\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175102 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdncz\" (UniqueName: 
\"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-kube-api-access-pdncz\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175192 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-scripts\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175305 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175338 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175384 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcjl5\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-kube-api-access-gcjl5\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.175867 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.177902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.182655 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-ceph\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.183015 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.183250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.183321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-config-data\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.197418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdncz\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-kube-api-access-pdncz\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.204713 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.259154 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.276959 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-public-tls-certs\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.277340 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-httpd-run\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.277467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-logs\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.277690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.277741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6950085a-b11f-459f-b07b-b53af7a40255-horizon-secret-key\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcjl5\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-kube-api-access-gcjl5\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2c29\" (UniqueName: \"kubernetes.io/projected/6950085a-b11f-459f-b07b-b53af7a40255-kube-api-access-h2c29\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278315 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6950085a-b11f-459f-b07b-b53af7a40255-logs\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278433 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-scripts\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.278486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-config-data\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.280449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-logs" (OuterVolumeSpecName: "logs") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.280511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.280979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.281502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.282164 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.282836 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.283853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.285320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.285881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.286467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.293324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-ceph\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.300016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcjl5\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-kube-api-access-gcjl5\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.327440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.334862 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.356777 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.379427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdncz\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-kube-api-access-pdncz\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.379483 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-combined-ca-bundle\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.379579 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-config-data\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.379610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-scripts\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.379721 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 
crc kubenswrapper[4764]: I1001 16:54:29.379752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-ceph\") pod \"f4f1ce31-53e8-439c-904b-aff4527ab132\" (UID: \"f4f1ce31-53e8-439c-904b-aff4527ab132\") " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2c29\" (UniqueName: \"kubernetes.io/projected/6950085a-b11f-459f-b07b-b53af7a40255-kube-api-access-h2c29\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380086 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6950085a-b11f-459f-b07b-b53af7a40255-logs\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-scripts\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380208 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-config-data\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/6950085a-b11f-459f-b07b-b53af7a40255-horizon-secret-key\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380419 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380432 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4f1ce31-53e8-439c-904b-aff4527ab132-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.380443 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.385216 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-scripts\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.385921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-config-data\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.388865 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6950085a-b11f-459f-b07b-b53af7a40255-logs\") pod \"horizon-5dd67b586f-m6wgn\" (UID: 
\"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.390673 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.390749 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-kube-api-access-pdncz" (OuterVolumeSpecName: "kube-api-access-pdncz") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "kube-api-access-pdncz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.392645 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.403581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6950085a-b11f-459f-b07b-b53af7a40255-horizon-secret-key\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.405336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2c29\" (UniqueName: \"kubernetes.io/projected/6950085a-b11f-459f-b07b-b53af7a40255-kube-api-access-h2c29\") pod \"horizon-5dd67b586f-m6wgn\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.410031 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-ceph" (OuterVolumeSpecName: "ceph") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.411736 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-config-data" (OuterVolumeSpecName: "config-data") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.417128 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-scripts" (OuterVolumeSpecName: "scripts") pod "f4f1ce31-53e8-439c-904b-aff4527ab132" (UID: "f4f1ce31-53e8-439c-904b-aff4527ab132"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:29 crc kubenswrapper[4764]: W1001 16:54:29.424980 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b47ff15_96b3_49ac_a5e5_1ce1051d53a0.slice/crio-b1164e697c86434a8ff37d67db5dfc16938c5f0c952d64a6e8e12fe521a90834 WatchSource:0}: Error finding container b1164e697c86434a8ff37d67db5dfc16938c5f0c952d64a6e8e12fe521a90834: Status 404 returned error can't find the container with id b1164e697c86434a8ff37d67db5dfc16938c5f0c952d64a6e8e12fe521a90834 Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.483421 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.483458 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.483468 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdncz\" (UniqueName: \"kubernetes.io/projected/f4f1ce31-53e8-439c-904b-aff4527ab132-kube-api-access-pdncz\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.483864 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.483878 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 
16:54:29.483890 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f1ce31-53e8-439c-904b-aff4527ab132-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.511289 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.582416 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-65467fc489-ql7ls"] Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.586927 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.610126 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-xzsgk"] Oct 01 16:54:29 crc kubenswrapper[4764]: W1001 16:54:29.627005 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod372fcf2a_4cac_4506_b875_aada21327f29.slice/crio-01b22a7135d6446a8750b78450f52dead5ebf51ee1ad77f747fba0de226a6a70 WatchSource:0}: Error finding container 01b22a7135d6446a8750b78450f52dead5ebf51ee1ad77f747fba0de226a6a70: Status 404 returned error can't find the container with id 01b22a7135d6446a8750b78450f52dead5ebf51ee1ad77f747fba0de226a6a70 Oct 01 16:54:29 crc kubenswrapper[4764]: I1001 16:54:29.692554 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.102616 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.131878 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f4b10553-8ea5-49bf-96cf-c22620f1ced3","Type":"ContainerStarted","Data":"29e8159af67e3cf836c9c6ba320b281a14da2dc34c9a0945837e14fb3c0c4e53"} Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.132883 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65467fc489-ql7ls" event={"ID":"310ac18b-b8b2-4ec2-8c8f-40484023c08d","Type":"ContainerStarted","Data":"e4c0ad4586dae21ec6f2ef4be8992e6e57bb1d2348849c9d70c989b33dd7859d"} Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.133933 4764 generic.go:334] "Generic (PLEG): container finished" podID="372fcf2a-4cac-4506-b875-aada21327f29" containerID="9b679cf0b93bea3fce5f3ae4f8ea91361e8c41366e268cb8007b03a271f88744" exitCode=0 Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.134007 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xzsgk" event={"ID":"372fcf2a-4cac-4506-b875-aada21327f29","Type":"ContainerDied","Data":"9b679cf0b93bea3fce5f3ae4f8ea91361e8c41366e268cb8007b03a271f88744"} Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.134024 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xzsgk" event={"ID":"372fcf2a-4cac-4506-b875-aada21327f29","Type":"ContainerStarted","Data":"01b22a7135d6446a8750b78450f52dead5ebf51ee1ad77f747fba0de226a6a70"} Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.137414 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.137830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0","Type":"ContainerStarted","Data":"b1164e697c86434a8ff37d67db5dfc16938c5f0c952d64a6e8e12fe521a90834"} Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.263837 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.288386 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.294666 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dd67b586f-m6wgn"] Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.306808 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.308633 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.310654 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.312712 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.321023 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:30 crc kubenswrapper[4764]: W1001 16:54:30.368747 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6950085a_b11f_459f_b07b_b53af7a40255.slice/crio-bc7d5cd45b340d65945e707da2c13c61d8d4269509ab5d02772460f50676455c WatchSource:0}: Error finding container bc7d5cd45b340d65945e707da2c13c61d8d4269509ab5d02772460f50676455c: Status 404 returned error can't find the container with id bc7d5cd45b340d65945e707da2c13c61d8d4269509ab5d02772460f50676455c Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq6hk\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-kube-api-access-zq6hk\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc 
kubenswrapper[4764]: I1001 16:54:30.400227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-config-data\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-scripts\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-logs\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc 
kubenswrapper[4764]: I1001 16:54:30.400682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-ceph\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.400728 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.505840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-ceph\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506432 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506468 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zq6hk\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-kube-api-access-zq6hk\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-config-data\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-scripts\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.506689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-logs\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.508990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.510756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.511764 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.513122 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-logs\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.517176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-ceph\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.521492 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.522401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.522498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-config-data\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.525726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.528722 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq6hk\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-kube-api-access-zq6hk\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.557648 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " pod="openstack/glance-default-external-api-0" 
Oct 01 16:54:30 crc kubenswrapper[4764]: I1001 16:54:30.631286 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.156301 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd67b586f-m6wgn" event={"ID":"6950085a-b11f-459f-b07b-b53af7a40255","Type":"ContainerStarted","Data":"bc7d5cd45b340d65945e707da2c13c61d8d4269509ab5d02772460f50676455c"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.159422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1","Type":"ContainerStarted","Data":"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.159485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1","Type":"ContainerStarted","Data":"c56d48f85051e3b6b55af2b8e0bbfa6f6ea40b9dbc4a3180e11b4142c06a2c8d"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.173308 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0","Type":"ContainerStarted","Data":"9816e9838eece29394f2c219f3300b64795c18a2340c80d3a434f737733eab28"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.173359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8b47ff15-96b3-49ac-a5e5-1ce1051d53a0","Type":"ContainerStarted","Data":"16c1a2011bf93506874ee7c7fc035f4acdc5ca4f272869b6702433612a378390"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.180079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" 
event={"ID":"f4b10553-8ea5-49bf-96cf-c22620f1ced3","Type":"ContainerStarted","Data":"a1890a5fe359496349b7b7e6eaf6f14bde0ac549d84a4a71116f61112b7cc815"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.180123 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f4b10553-8ea5-49bf-96cf-c22620f1ced3","Type":"ContainerStarted","Data":"ab14817c2c7114b012d61ecb032d977ed273a9687023ce02ba10c775ab90c9c5"} Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.214782 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.244569891 podStartE2EDuration="3.214759885s" podCreationTimestamp="2025-10-01 16:54:28 +0000 UTC" firstStartedPulling="2025-10-01 16:54:29.445253868 +0000 UTC m=+3132.444900703" lastFinishedPulling="2025-10-01 16:54:30.415443862 +0000 UTC m=+3133.415090697" observedRunningTime="2025-10-01 16:54:31.205320243 +0000 UTC m=+3134.204967078" watchObservedRunningTime="2025-10-01 16:54:31.214759885 +0000 UTC m=+3134.214406720" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.233579 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.246160 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=1.996976 podStartE2EDuration="3.246136867s" podCreationTimestamp="2025-10-01 16:54:28 +0000 UTC" firstStartedPulling="2025-10-01 16:54:29.159977091 +0000 UTC m=+3132.159623936" lastFinishedPulling="2025-10-01 16:54:30.409137968 +0000 UTC m=+3133.408784803" observedRunningTime="2025-10-01 16:54:31.233551648 +0000 UTC m=+3134.233198483" watchObservedRunningTime="2025-10-01 16:54:31.246136867 +0000 UTC m=+3134.245783702" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.535236 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.637275 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9hd8\" (UniqueName: \"kubernetes.io/projected/372fcf2a-4cac-4506-b875-aada21327f29-kube-api-access-d9hd8\") pod \"372fcf2a-4cac-4506-b875-aada21327f29\" (UID: \"372fcf2a-4cac-4506-b875-aada21327f29\") " Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.644552 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dd67b586f-m6wgn"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.662334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b86858896-mnqsl"] Oct 01 16:54:31 crc kubenswrapper[4764]: E1001 16:54:31.663203 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372fcf2a-4cac-4506-b875-aada21327f29" containerName="mariadb-database-create" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.663220 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="372fcf2a-4cac-4506-b875-aada21327f29" containerName="mariadb-database-create" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.663348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372fcf2a-4cac-4506-b875-aada21327f29-kube-api-access-d9hd8" (OuterVolumeSpecName: "kube-api-access-d9hd8") pod "372fcf2a-4cac-4506-b875-aada21327f29" (UID: "372fcf2a-4cac-4506-b875-aada21327f29"). InnerVolumeSpecName "kube-api-access-d9hd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.663596 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="372fcf2a-4cac-4506-b875-aada21327f29" containerName="mariadb-database-create" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.665232 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.675351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.703823 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.717801 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b86858896-mnqsl"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.745575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05cbc202-6448-46cc-85d0-d4b432506ed5-logs\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.745645 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-scripts\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.745741 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-combined-ca-bundle\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.745940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-secret-key\") pod 
\"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.746067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-tls-certs\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.746097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngz7d\" (UniqueName: \"kubernetes.io/projected/05cbc202-6448-46cc-85d0-d4b432506ed5-kube-api-access-ngz7d\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.746165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-config-data\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.746345 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9hd8\" (UniqueName: \"kubernetes.io/projected/372fcf2a-4cac-4506-b875-aada21327f29-kube-api-access-d9hd8\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.784000 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f1ce31-53e8-439c-904b-aff4527ab132" path="/var/lib/kubelet/pods/f4f1ce31-53e8-439c-904b-aff4527ab132/volumes" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.789407 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-65467fc489-ql7ls"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.825287 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-554f5d45dd-s9w79"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.827367 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05cbc202-6448-46cc-85d0-d4b432506ed5-logs\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848573 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-scripts\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-combined-ca-bundle\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-secret-key\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848744 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-tls-certs\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848767 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngz7d\" (UniqueName: \"kubernetes.io/projected/05cbc202-6448-46cc-85d0-d4b432506ed5-kube-api-access-ngz7d\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.848792 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-config-data\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.858720 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-config-data\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.861193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-scripts\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.861442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05cbc202-6448-46cc-85d0-d4b432506ed5-logs\") pod 
\"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.866018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-secret-key\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.875149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-tls-certs\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.875764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngz7d\" (UniqueName: \"kubernetes.io/projected/05cbc202-6448-46cc-85d0-d4b432506ed5-kube-api-access-ngz7d\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.876300 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.876446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-combined-ca-bundle\") pod \"horizon-b86858896-mnqsl\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.909986 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554f5d45dd-s9w79"] Oct 01 16:54:31 crc kubenswrapper[4764]: 
I1001 16:54:31.962131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-scripts\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.962404 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-horizon-secret-key\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.962421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-horizon-tls-certs\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.962482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-logs\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.962501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-config-data\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.962541 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-combined-ca-bundle\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:31 crc kubenswrapper[4764]: I1001 16:54:31.962563 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5pf\" (UniqueName: \"kubernetes.io/projected/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-kube-api-access-sz5pf\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.066103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-scripts\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.066800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-scripts\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.066922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-horizon-tls-certs\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.066942 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-horizon-secret-key\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.067011 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-logs\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.067510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-logs\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.067628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-config-data\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.067685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-combined-ca-bundle\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.067710 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5pf\" (UniqueName: \"kubernetes.io/projected/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-kube-api-access-sz5pf\") pod 
\"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.068995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-config-data\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.073537 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-horizon-secret-key\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.074029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-combined-ca-bundle\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.075800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-horizon-tls-certs\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.094644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5pf\" (UniqueName: \"kubernetes.io/projected/b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d-kube-api-access-sz5pf\") pod \"horizon-554f5d45dd-s9w79\" (UID: \"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d\") " 
pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.095247 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.162581 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.216525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-xzsgk" event={"ID":"372fcf2a-4cac-4506-b875-aada21327f29","Type":"ContainerDied","Data":"01b22a7135d6446a8750b78450f52dead5ebf51ee1ad77f747fba0de226a6a70"} Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.216573 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b22a7135d6446a8750b78450f52dead5ebf51ee1ad77f747fba0de226a6a70" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.216632 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-xzsgk" Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.219229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf","Type":"ContainerStarted","Data":"473570b7886da0daf32c5b7285bae9f8e36fe55255757b62549f9f8278be6c77"} Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.229792 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-log" containerID="cri-o://9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a" gracePeriod=30 Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.230182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1","Type":"ContainerStarted","Data":"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640"} Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.230578 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-httpd" containerID="cri-o://ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640" gracePeriod=30 Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.258700 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.258676804 podStartE2EDuration="4.258676804s" podCreationTimestamp="2025-10-01 16:54:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:54:32.254680245 +0000 UTC m=+3135.254327080" watchObservedRunningTime="2025-10-01 16:54:32.258676804 +0000 UTC m=+3135.258323639" Oct 01 
16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.728621 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b86858896-mnqsl"] Oct 01 16:54:32 crc kubenswrapper[4764]: W1001 16:54:32.739223 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05cbc202_6448_46cc_85d0_d4b432506ed5.slice/crio-77f129cc3ba76f3dafae42d60132b36a67146c0781067930d937ce62fc83f092 WatchSource:0}: Error finding container 77f129cc3ba76f3dafae42d60132b36a67146c0781067930d937ce62fc83f092: Status 404 returned error can't find the container with id 77f129cc3ba76f3dafae42d60132b36a67146c0781067930d937ce62fc83f092 Oct 01 16:54:32 crc kubenswrapper[4764]: I1001 16:54:32.961818 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-554f5d45dd-s9w79"] Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.141637 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.214441 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-internal-tls-certs\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.214746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.214890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-httpd-run\") pod 
\"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.214924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcjl5\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-kube-api-access-gcjl5\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.214947 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-ceph\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.215063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-scripts\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.215096 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-logs\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.215138 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-config-data\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.215154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-combined-ca-bundle\") pod \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\" (UID: \"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1\") " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.215803 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.215964 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-logs" (OuterVolumeSpecName: "logs") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.221876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-ceph" (OuterVolumeSpecName: "ceph") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.222179 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-scripts" (OuterVolumeSpecName: "scripts") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.222586 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-kube-api-access-gcjl5" (OuterVolumeSpecName: "kube-api-access-gcjl5") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "kube-api-access-gcjl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.240765 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.245386 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerID="ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640" exitCode=143 Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.245608 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerID="9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a" exitCode=143 Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.245550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1","Type":"ContainerDied","Data":"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640"} Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.246128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1","Type":"ContainerDied","Data":"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a"} Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.246259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1","Type":"ContainerDied","Data":"c56d48f85051e3b6b55af2b8e0bbfa6f6ea40b9dbc4a3180e11b4142c06a2c8d"} Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.245505 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.246351 4764 scope.go:117] "RemoveContainer" containerID="ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.249368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554f5d45dd-s9w79" event={"ID":"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d","Type":"ContainerStarted","Data":"fc839b3691ff3039b99993cf25b938c58dc3df22389ecebdeafecaa9a9c9d6d8"} Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.254251 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf","Type":"ContainerStarted","Data":"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce"} Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.258270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b86858896-mnqsl" event={"ID":"05cbc202-6448-46cc-85d0-d4b432506ed5","Type":"ContainerStarted","Data":"77f129cc3ba76f3dafae42d60132b36a67146c0781067930d937ce62fc83f092"} Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.303284 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.314529 4764 scope.go:117] "RemoveContainer" containerID="9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.317390 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.317659 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.317714 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.317831 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.317897 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.318010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcjl5\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-kube-api-access-gcjl5\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc 
kubenswrapper[4764]: I1001 16:54:33.318342 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.321375 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.323654 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-config-data" (OuterVolumeSpecName: "config-data") pod "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" (UID: "8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.348349 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.422976 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.423005 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.423020 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.538709 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.551532 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.591022 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.621084 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.642276 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:33 crc kubenswrapper[4764]: E1001 16:54:33.643574 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-httpd" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.643601 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-httpd" Oct 01 16:54:33 crc kubenswrapper[4764]: E1001 16:54:33.643620 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-log" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.643629 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-log" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.643940 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-httpd" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.643967 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" containerName="glance-log" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.645414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.645519 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.648135 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.648383 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.649599 4764 scope.go:117] "RemoveContainer" containerID="ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640" Oct 01 16:54:33 crc kubenswrapper[4764]: E1001 16:54:33.669254 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640\": container with ID starting with ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640 not found: ID does not exist" containerID="ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.669299 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640"} err="failed to get container status \"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640\": rpc error: code = NotFound desc = could not find container \"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640\": container with ID starting with ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640 not found: ID does not exist" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.669328 4764 scope.go:117] "RemoveContainer" containerID="9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a" Oct 01 16:54:33 crc kubenswrapper[4764]: E1001 16:54:33.671983 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a\": container with ID starting with 9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a not found: ID does not exist" containerID="9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.672087 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a"} err="failed to get container status \"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a\": rpc error: code = NotFound desc = could not find container \"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a\": container with ID starting with 9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a not found: ID does not exist" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.672119 4764 scope.go:117] "RemoveContainer" containerID="ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.676816 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640"} err="failed to get container status \"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640\": rpc error: code = NotFound desc = could not find container \"ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640\": container with ID starting with ed02e8d9b316e9345695fd2c82439fa4bd7a6354880da3efc66d746e2db91640 not found: ID does not exist" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.676863 4764 scope.go:117] "RemoveContainer" containerID="9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.677323 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a"} err="failed to get container status \"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a\": rpc error: code = NotFound desc = could not find container \"9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a\": container with ID starting with 9ee065c32310155391415dad61506ecf9cec7031fcd5e7651939529a419f603a not found: ID does not exist" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.732645 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.732710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9860202-23f9-492f-b6b7-fd90d113ad6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.732822 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.732925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9860202-23f9-492f-b6b7-fd90d113ad6d-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.732948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.733014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9860202-23f9-492f-b6b7-fd90d113ad6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.733033 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.733115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44n5v\" (UniqueName: \"kubernetes.io/projected/c9860202-23f9-492f-b6b7-fd90d113ad6d-kube-api-access-44n5v\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.733147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.740847 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1" path="/var/lib/kubelet/pods/8c1ec87a-e1d0-4aff-b5be-e3cbc76edbb1/volumes" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.835872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9860202-23f9-492f-b6b7-fd90d113ad6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.835923 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.835990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44n5v\" (UniqueName: \"kubernetes.io/projected/c9860202-23f9-492f-b6b7-fd90d113ad6d-kube-api-access-44n5v\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836146 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9860202-23f9-492f-b6b7-fd90d113ad6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9860202-23f9-492f-b6b7-fd90d113ad6d-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9860202-23f9-492f-b6b7-fd90d113ad6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.836884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9860202-23f9-492f-b6b7-fd90d113ad6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.837074 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.848459 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.850445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.851733 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.869573 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c9860202-23f9-492f-b6b7-fd90d113ad6d-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.874742 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44n5v\" (UniqueName: \"kubernetes.io/projected/c9860202-23f9-492f-b6b7-fd90d113ad6d-kube-api-access-44n5v\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.877169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9860202-23f9-492f-b6b7-fd90d113ad6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.900787 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9860202-23f9-492f-b6b7-fd90d113ad6d\") " pod="openstack/glance-default-internal-api-0" Oct 01 16:54:33 crc kubenswrapper[4764]: I1001 16:54:33.983785 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:34 crc kubenswrapper[4764]: I1001 16:54:34.271138 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf","Type":"ContainerStarted","Data":"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335"} Oct 01 16:54:34 crc kubenswrapper[4764]: I1001 16:54:34.271501 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-log" containerID="cri-o://441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce" gracePeriod=30 Oct 01 16:54:34 crc kubenswrapper[4764]: I1001 16:54:34.271689 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-httpd" containerID="cri-o://66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335" gracePeriod=30 Oct 01 16:54:34 crc kubenswrapper[4764]: I1001 16:54:34.722850 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.722827867 podStartE2EDuration="4.722827867s" podCreationTimestamp="2025-10-01 16:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:54:34.29444042 +0000 UTC m=+3137.294087275" watchObservedRunningTime="2025-10-01 16:54:34.722827867 +0000 UTC m=+3137.722474712" Oct 01 16:54:34 crc kubenswrapper[4764]: I1001 16:54:34.787121 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 16:54:34 crc kubenswrapper[4764]: I1001 16:54:34.985511 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063010 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063111 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-config-data\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-combined-ca-bundle\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063204 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-scripts\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063229 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-logs\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-logs" (OuterVolumeSpecName: 
"logs") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063792 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq6hk\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-kube-api-access-zq6hk\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-public-tls-certs\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.063896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-httpd-run\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.064090 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-ceph\") pod \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\" (UID: \"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf\") " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.064860 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.066132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.069589 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-ceph" (OuterVolumeSpecName: "ceph") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.069780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-scripts" (OuterVolumeSpecName: "scripts") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.070260 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.070905 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-kube-api-access-zq6hk" (OuterVolumeSpecName: "kube-api-access-zq6hk") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "kube-api-access-zq6hk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.098391 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.119881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.130605 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-config-data" (OuterVolumeSpecName: "config-data") pod "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" (UID: "36ca9806-0e72-4f7b-bc9f-79ce68d7cebf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169601 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169671 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169683 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169693 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169708 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169717 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq6hk\" (UniqueName: \"kubernetes.io/projected/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-kube-api-access-zq6hk\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169725 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.169734 4764 reconciler_common.go:293] "Volume detached 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.238532 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.271356 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287363 4764 generic.go:334] "Generic (PLEG): container finished" podID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerID="66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335" exitCode=0 Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287400 4764 generic.go:334] "Generic (PLEG): container finished" podID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerID="441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce" exitCode=143 Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287516 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf","Type":"ContainerDied","Data":"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335"} Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf","Type":"ContainerDied","Data":"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce"} Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"36ca9806-0e72-4f7b-bc9f-79ce68d7cebf","Type":"ContainerDied","Data":"473570b7886da0daf32c5b7285bae9f8e36fe55255757b62549f9f8278be6c77"} Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.287602 4764 scope.go:117] "RemoveContainer" containerID="66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.292121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9860202-23f9-492f-b6b7-fd90d113ad6d","Type":"ContainerStarted","Data":"9f4cde531926776ba95c706a81ad7ed6ca287e4802e364bedda241d3b274e04c"} Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.325669 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.348129 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.364892 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] 
Oct 01 16:54:35 crc kubenswrapper[4764]: E1001 16:54:35.365315 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-httpd" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.365332 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-httpd" Oct 01 16:54:35 crc kubenswrapper[4764]: E1001 16:54:35.365350 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-log" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.365357 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-log" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.365516 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-httpd" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.365532 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" containerName="glance-log" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.366486 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.369226 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.371936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.372681 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.378283 4764 scope.go:117] "RemoveContainer" containerID="441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.474830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.474931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.474988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-scripts\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 
16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.475011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48929539-7e51-4e55-bf3f-d168cab2e600-ceph\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.475029 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48929539-7e51-4e55-bf3f-d168cab2e600-logs\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.475079 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-config-data\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.475103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfjd4\" (UniqueName: \"kubernetes.io/projected/48929539-7e51-4e55-bf3f-d168cab2e600-kube-api-access-rfjd4\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.475151 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48929539-7e51-4e55-bf3f-d168cab2e600-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" 
Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.475173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.488098 4764 scope.go:117] "RemoveContainer" containerID="66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335" Oct 01 16:54:35 crc kubenswrapper[4764]: E1001 16:54:35.488625 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335\": container with ID starting with 66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335 not found: ID does not exist" containerID="66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.488727 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335"} err="failed to get container status \"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335\": rpc error: code = NotFound desc = could not find container \"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335\": container with ID starting with 66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335 not found: ID does not exist" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.488859 4764 scope.go:117] "RemoveContainer" containerID="441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce" Oct 01 16:54:35 crc kubenswrapper[4764]: E1001 16:54:35.489420 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce\": container with ID starting with 441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce not found: ID does not exist" containerID="441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.489439 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce"} err="failed to get container status \"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce\": rpc error: code = NotFound desc = could not find container \"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce\": container with ID starting with 441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce not found: ID does not exist" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.489453 4764 scope.go:117] "RemoveContainer" containerID="66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.489797 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335"} err="failed to get container status \"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335\": rpc error: code = NotFound desc = could not find container \"66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335\": container with ID starting with 66532a5a6aa9f12c7ab230b75cbb8496cdcb6f4f09190a873d8157cd2b647335 not found: ID does not exist" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.489843 4764 scope.go:117] "RemoveContainer" containerID="441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.490106 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce"} err="failed to get container status \"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce\": rpc error: code = NotFound desc = could not find container \"441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce\": container with ID starting with 441d8a66de272c595206db66d6d838def890d9a5077eba804bdd430a9e5b52ce not found: ID does not exist" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.578151 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.578823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-scripts\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.578915 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48929539-7e51-4e55-bf3f-d168cab2e600-ceph\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.578983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48929539-7e51-4e55-bf3f-d168cab2e600-logs\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc 
kubenswrapper[4764]: I1001 16:54:35.579113 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-config-data\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.579209 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfjd4\" (UniqueName: \"kubernetes.io/projected/48929539-7e51-4e55-bf3f-d168cab2e600-kube-api-access-rfjd4\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.579440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48929539-7e51-4e55-bf3f-d168cab2e600-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.579514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.579676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.579740 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48929539-7e51-4e55-bf3f-d168cab2e600-logs\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.579855 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.580385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48929539-7e51-4e55-bf3f-d168cab2e600-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.586077 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-scripts\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.595915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfjd4\" (UniqueName: \"kubernetes.io/projected/48929539-7e51-4e55-bf3f-d168cab2e600-kube-api-access-rfjd4\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.607837 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.608082 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.608463 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48929539-7e51-4e55-bf3f-d168cab2e600-ceph\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.608945 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48929539-7e51-4e55-bf3f-d168cab2e600-config-data\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.652636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"48929539-7e51-4e55-bf3f-d168cab2e600\") " pod="openstack/glance-default-external-api-0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.722743 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:54:35 crc kubenswrapper[4764]: E1001 16:54:35.723115 4764 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.734646 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ca9806-0e72-4f7b-bc9f-79ce68d7cebf" path="/var/lib/kubelet/pods/36ca9806-0e72-4f7b-bc9f-79ce68d7cebf/volumes" Oct 01 16:54:35 crc kubenswrapper[4764]: I1001 16:54:35.790441 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 16:54:36 crc kubenswrapper[4764]: I1001 16:54:36.314260 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9860202-23f9-492f-b6b7-fd90d113ad6d","Type":"ContainerStarted","Data":"c49c6ab6ef55ad181deea0b4b20edb95dae37861ddc4f0904f0b9f6492cbed72"} Oct 01 16:54:36 crc kubenswrapper[4764]: I1001 16:54:36.477061 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 16:54:37 crc kubenswrapper[4764]: I1001 16:54:37.337028 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9860202-23f9-492f-b6b7-fd90d113ad6d","Type":"ContainerStarted","Data":"b5e32d34cfe0e2e534b28d364e3d8e486731954bc637e907e99534ef74a13985"} Oct 01 16:54:37 crc kubenswrapper[4764]: I1001 16:54:37.361322 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.361304869 podStartE2EDuration="4.361304869s" podCreationTimestamp="2025-10-01 16:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:54:37.359220108 +0000 UTC m=+3140.358866943" watchObservedRunningTime="2025-10-01 16:54:37.361304869 +0000 UTC m=+3140.360951704" Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.750162 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.751593 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.834141 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-5680-account-create-rcvwg"] Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.837377 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.839870 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.884136 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-5680-account-create-rcvwg"] Oct 01 16:54:38 crc kubenswrapper[4764]: I1001 16:54:38.979701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j8rr\" (UniqueName: \"kubernetes.io/projected/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b-kube-api-access-7j8rr\") pod \"manila-5680-account-create-rcvwg\" (UID: \"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b\") " pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:39 crc kubenswrapper[4764]: I1001 16:54:39.081866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j8rr\" (UniqueName: \"kubernetes.io/projected/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b-kube-api-access-7j8rr\") pod \"manila-5680-account-create-rcvwg\" (UID: 
\"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b\") " pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:39 crc kubenswrapper[4764]: I1001 16:54:39.100022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j8rr\" (UniqueName: \"kubernetes.io/projected/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b-kube-api-access-7j8rr\") pod \"manila-5680-account-create-rcvwg\" (UID: \"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b\") " pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:39 crc kubenswrapper[4764]: I1001 16:54:39.171986 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:43 crc kubenswrapper[4764]: I1001 16:54:43.884267 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-k2274" podUID="d7c7ca03-94fe-4d3b-914b-669bfd41d526" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 16:54:43 crc kubenswrapper[4764]: I1001 16:54:43.984449 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:43 crc kubenswrapper[4764]: I1001 16:54:43.984528 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:44 crc kubenswrapper[4764]: I1001 16:54:44.776232 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/glance-default-internal-api-0" podUID="c9860202-23f9-492f-b6b7-fd90d113ad6d" containerName="glance-httpd" probeResult="failure" output="command timed out" Oct 01 16:54:45 crc kubenswrapper[4764]: I1001 16:54:45.778654 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/glance-default-internal-api-0" podUID="c9860202-23f9-492f-b6b7-fd90d113ad6d" containerName="glance-log" probeResult="failure" output="command timed out" Oct 01 
16:54:47 crc kubenswrapper[4764]: I1001 16:54:47.775565 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Oct 01 16:54:48 crc kubenswrapper[4764]: I1001 16:54:48.722176 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:54:48 crc kubenswrapper[4764]: E1001 16:54:48.722781 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:54:53 crc kubenswrapper[4764]: W1001 16:54:53.226338 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48929539_7e51_4e55_bf3f_d168cab2e600.slice/crio-ebcd2c1172def2a63afc039ccc168e76647d17b0bb5434a989c7d36612a8faf3 WatchSource:0}: Error finding container ebcd2c1172def2a63afc039ccc168e76647d17b0bb5434a989c7d36612a8faf3: Status 404 returned error can't find the container with id ebcd2c1172def2a63afc039ccc168e76647d17b0bb5434a989c7d36612a8faf3 Oct 01 16:54:53 crc kubenswrapper[4764]: I1001 16:54:53.554701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48929539-7e51-4e55-bf3f-d168cab2e600","Type":"ContainerStarted","Data":"ebcd2c1172def2a63afc039ccc168e76647d17b0bb5434a989c7d36612a8faf3"} Oct 01 16:54:53 crc kubenswrapper[4764]: E1001 16:54:53.816346 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 01 16:54:53 crc kubenswrapper[4764]: E1001 16:54:53.817268 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n555hb4h6ch589h57h5cch6dh557h86h685h545h54bhcfh545h647h5d5h5f5h56fh548h5f6h8bh56h68fhc5h566h5d9h8bh98h578h5dh65bh9dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkdgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices
:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-65467fc489-ql7ls_openstack(310ac18b-b8b2-4ec2-8c8f-40484023c08d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:54:53 crc kubenswrapper[4764]: E1001 16:54:53.821671 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-65467fc489-ql7ls" podUID="310ac18b-b8b2-4ec2-8c8f-40484023c08d" Oct 01 16:54:53 crc kubenswrapper[4764]: I1001 16:54:53.878216 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-5680-account-create-rcvwg"] Oct 01 16:54:53 crc kubenswrapper[4764]: W1001 16:54:53.884459 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9fc155d_1668_42dd_8db9_9ae2b8fa2a4b.slice/crio-3cd4424e543c14ff4d4c553009a37d97b2eb8f8a63140e4bd7011362313d3838 WatchSource:0}: Error finding container 3cd4424e543c14ff4d4c553009a37d97b2eb8f8a63140e4bd7011362313d3838: Status 404 returned error can't find the container with id 3cd4424e543c14ff4d4c553009a37d97b2eb8f8a63140e4bd7011362313d3838 Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.023960 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.025134 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.046996 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:54 crc kubenswrapper[4764]: E1001 16:54:54.386038 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 01 16:54:54 crc kubenswrapper[4764]: E1001 16:54:54.386290 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n55dhcdhch59h685h594h8fh5fch668h5c7h5d7h5b4hbchb4h66h695hcch558h648h67fh5d8h64h685hb8h77hdbh576h557h5ddh588h579h564q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngz7d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNo
nRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-b86858896-mnqsl_openstack(05cbc202-6448-46cc-85d0-d4b432506ed5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:54:54 crc kubenswrapper[4764]: E1001 16:54:54.389796 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.582964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5680-account-create-rcvwg" event={"ID":"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b","Type":"ContainerStarted","Data":"bd0d85c512fde3baa853a6fc3609290185a9ae0c8ca0d29b73d35e637c305c03"} Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.583553 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5680-account-create-rcvwg" event={"ID":"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b","Type":"ContainerStarted","Data":"3cd4424e543c14ff4d4c553009a37d97b2eb8f8a63140e4bd7011362313d3838"} Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.586989 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"48929539-7e51-4e55-bf3f-d168cab2e600","Type":"ContainerStarted","Data":"e0916672db7616fa2dbedd599add3a72f17f1504e4f5d92b651b7fa44b3e3446"} Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.587657 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:54 crc kubenswrapper[4764]: E1001 16:54:54.597254 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.607715 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-5680-account-create-rcvwg" podStartSLOduration=16.607696161 podStartE2EDuration="16.607696161s" podCreationTimestamp="2025-10-01 16:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:54:54.602978845 +0000 UTC m=+3157.602625680" watchObservedRunningTime="2025-10-01 16:54:54.607696161 +0000 UTC m=+3157.607342996" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.901272 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.973215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkdgc\" (UniqueName: \"kubernetes.io/projected/310ac18b-b8b2-4ec2-8c8f-40484023c08d-kube-api-access-fkdgc\") pod \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.973291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-config-data\") pod \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.973329 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/310ac18b-b8b2-4ec2-8c8f-40484023c08d-horizon-secret-key\") pod \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.973408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ac18b-b8b2-4ec2-8c8f-40484023c08d-logs\") pod \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.973649 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-scripts\") pod \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\" (UID: \"310ac18b-b8b2-4ec2-8c8f-40484023c08d\") " Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.973897 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/310ac18b-b8b2-4ec2-8c8f-40484023c08d-logs" (OuterVolumeSpecName: "logs") pod "310ac18b-b8b2-4ec2-8c8f-40484023c08d" (UID: "310ac18b-b8b2-4ec2-8c8f-40484023c08d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.974194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-scripts" (OuterVolumeSpecName: "scripts") pod "310ac18b-b8b2-4ec2-8c8f-40484023c08d" (UID: "310ac18b-b8b2-4ec2-8c8f-40484023c08d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.974391 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-config-data" (OuterVolumeSpecName: "config-data") pod "310ac18b-b8b2-4ec2-8c8f-40484023c08d" (UID: "310ac18b-b8b2-4ec2-8c8f-40484023c08d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.974608 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.974630 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/310ac18b-b8b2-4ec2-8c8f-40484023c08d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.974641 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/310ac18b-b8b2-4ec2-8c8f-40484023c08d-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.979087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310ac18b-b8b2-4ec2-8c8f-40484023c08d-kube-api-access-fkdgc" (OuterVolumeSpecName: "kube-api-access-fkdgc") pod "310ac18b-b8b2-4ec2-8c8f-40484023c08d" (UID: "310ac18b-b8b2-4ec2-8c8f-40484023c08d"). InnerVolumeSpecName "kube-api-access-fkdgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:54 crc kubenswrapper[4764]: I1001 16:54:54.979087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310ac18b-b8b2-4ec2-8c8f-40484023c08d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "310ac18b-b8b2-4ec2-8c8f-40484023c08d" (UID: "310ac18b-b8b2-4ec2-8c8f-40484023c08d"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:55 crc kubenswrapper[4764]: E1001 16:54:55.034283 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 01 16:54:55 crc kubenswrapper[4764]: E1001 16:54:55.034442 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64bhf4h547h578h5dbh5d6h58bh5bfh8fhdbh5cfh58bh646h5d7h575h666h5b6h646h99h67ch5b9h569hb7h549h5c9h587h98h555h6ch5cfh68fh686q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sz5pf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnly
RootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-554f5d45dd-s9w79_openstack(b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:54:55 crc kubenswrapper[4764]: E1001 16:54:55.036574 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-554f5d45dd-s9w79" podUID="b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d" Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.077091 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkdgc\" (UniqueName: \"kubernetes.io/projected/310ac18b-b8b2-4ec2-8c8f-40484023c08d-kube-api-access-fkdgc\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.077145 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/310ac18b-b8b2-4ec2-8c8f-40484023c08d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.597906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-65467fc489-ql7ls" event={"ID":"310ac18b-b8b2-4ec2-8c8f-40484023c08d","Type":"ContainerDied","Data":"e4c0ad4586dae21ec6f2ef4be8992e6e57bb1d2348849c9d70c989b33dd7859d"} Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.597988 4764 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.598011 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-65467fc489-ql7ls" Oct 01 16:54:55 crc kubenswrapper[4764]: E1001 16:54:55.601018 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-554f5d45dd-s9w79" podUID="b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d" Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.657167 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-65467fc489-ql7ls"] Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.670113 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-65467fc489-ql7ls"] Oct 01 16:54:55 crc kubenswrapper[4764]: I1001 16:54:55.739308 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310ac18b-b8b2-4ec2-8c8f-40484023c08d" path="/var/lib/kubelet/pods/310ac18b-b8b2-4ec2-8c8f-40484023c08d/volumes" Oct 01 16:54:56 crc kubenswrapper[4764]: E1001 16:54:56.302614 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 01 16:54:56 crc kubenswrapper[4764]: E1001 16:54:56.302880 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n89h67dh7bh8dh59chddh5cdh5bch89h74hc5h564h698h648h5fbhf4h67h54h575h556h57bh554h77h5b4hf9h67fh64dh678h68ch698h64hbdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h2c29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5dd67b586f-m6wgn_openstack(6950085a-b11f-459f-b07b-b53af7a40255): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:54:56 crc kubenswrapper[4764]: E1001 
16:54:56.305370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5dd67b586f-m6wgn" podUID="6950085a-b11f-459f-b07b-b53af7a40255" Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.489086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.492400 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.614927 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48929539-7e51-4e55-bf3f-d168cab2e600","Type":"ContainerStarted","Data":"0f35f65891c434ad03f47271c9d1686fa30451c6e9f2a3b16d3fef31623d21c1"} Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.621114 4764 generic.go:334] "Generic (PLEG): container finished" podID="d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b" containerID="bd0d85c512fde3baa853a6fc3609290185a9ae0c8ca0d29b73d35e637c305c03" exitCode=0 Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.621173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5680-account-create-rcvwg" event={"ID":"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b","Type":"ContainerDied","Data":"bd0d85c512fde3baa853a6fc3609290185a9ae0c8ca0d29b73d35e637c305c03"} Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.645105 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=21.645089958 podStartE2EDuration="21.645089958s" 
podCreationTimestamp="2025-10-01 16:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:54:56.644515074 +0000 UTC m=+3159.644161919" watchObservedRunningTime="2025-10-01 16:54:56.645089958 +0000 UTC m=+3159.644736793" Oct 01 16:54:56 crc kubenswrapper[4764]: I1001 16:54:56.964165 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.113309 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-config-data\") pod \"6950085a-b11f-459f-b07b-b53af7a40255\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.113461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6950085a-b11f-459f-b07b-b53af7a40255-horizon-secret-key\") pod \"6950085a-b11f-459f-b07b-b53af7a40255\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.113512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2c29\" (UniqueName: \"kubernetes.io/projected/6950085a-b11f-459f-b07b-b53af7a40255-kube-api-access-h2c29\") pod \"6950085a-b11f-459f-b07b-b53af7a40255\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.113728 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6950085a-b11f-459f-b07b-b53af7a40255-logs\") pod \"6950085a-b11f-459f-b07b-b53af7a40255\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.113802 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-scripts\") pod \"6950085a-b11f-459f-b07b-b53af7a40255\" (UID: \"6950085a-b11f-459f-b07b-b53af7a40255\") " Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.114447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-scripts" (OuterVolumeSpecName: "scripts") pod "6950085a-b11f-459f-b07b-b53af7a40255" (UID: "6950085a-b11f-459f-b07b-b53af7a40255"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.114902 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6950085a-b11f-459f-b07b-b53af7a40255-logs" (OuterVolumeSpecName: "logs") pod "6950085a-b11f-459f-b07b-b53af7a40255" (UID: "6950085a-b11f-459f-b07b-b53af7a40255"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.115033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-config-data" (OuterVolumeSpecName: "config-data") pod "6950085a-b11f-459f-b07b-b53af7a40255" (UID: "6950085a-b11f-459f-b07b-b53af7a40255"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.120349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6950085a-b11f-459f-b07b-b53af7a40255-kube-api-access-h2c29" (OuterVolumeSpecName: "kube-api-access-h2c29") pod "6950085a-b11f-459f-b07b-b53af7a40255" (UID: "6950085a-b11f-459f-b07b-b53af7a40255"). InnerVolumeSpecName "kube-api-access-h2c29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.127318 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6950085a-b11f-459f-b07b-b53af7a40255-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6950085a-b11f-459f-b07b-b53af7a40255" (UID: "6950085a-b11f-459f-b07b-b53af7a40255"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.217122 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6950085a-b11f-459f-b07b-b53af7a40255-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.217184 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2c29\" (UniqueName: \"kubernetes.io/projected/6950085a-b11f-459f-b07b-b53af7a40255-kube-api-access-h2c29\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.217214 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6950085a-b11f-459f-b07b-b53af7a40255-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.217226 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.217238 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6950085a-b11f-459f-b07b-b53af7a40255-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.637002 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dd67b586f-m6wgn" 
event={"ID":"6950085a-b11f-459f-b07b-b53af7a40255","Type":"ContainerDied","Data":"bc7d5cd45b340d65945e707da2c13c61d8d4269509ab5d02772460f50676455c"} Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.637217 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dd67b586f-m6wgn" Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.712484 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dd67b586f-m6wgn"] Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.718940 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5dd67b586f-m6wgn"] Oct 01 16:54:57 crc kubenswrapper[4764]: I1001 16:54:57.741832 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6950085a-b11f-459f-b07b-b53af7a40255" path="/var/lib/kubelet/pods/6950085a-b11f-459f-b07b-b53af7a40255/volumes" Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.589320 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.660168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-5680-account-create-rcvwg" event={"ID":"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b","Type":"ContainerDied","Data":"3cd4424e543c14ff4d4c553009a37d97b2eb8f8a63140e4bd7011362313d3838"} Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.660496 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cd4424e543c14ff4d4c553009a37d97b2eb8f8a63140e4bd7011362313d3838" Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.660264 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-5680-account-create-rcvwg" Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.764994 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j8rr\" (UniqueName: \"kubernetes.io/projected/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b-kube-api-access-7j8rr\") pod \"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b\" (UID: \"d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b\") " Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.771158 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b-kube-api-access-7j8rr" (OuterVolumeSpecName: "kube-api-access-7j8rr") pod "d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b" (UID: "d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b"). InnerVolumeSpecName "kube-api-access-7j8rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:54:59 crc kubenswrapper[4764]: I1001 16:54:59.868765 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j8rr\" (UniqueName: \"kubernetes.io/projected/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b-kube-api-access-7j8rr\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:00 crc kubenswrapper[4764]: I1001 16:55:00.723280 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:55:00 crc kubenswrapper[4764]: E1001 16:55:00.723799 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.243114 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-db-sync-lcwlv"] Oct 01 16:55:04 crc kubenswrapper[4764]: E1001 16:55:04.244187 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b" containerName="mariadb-account-create" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.244203 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b" containerName="mariadb-account-create" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.244450 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b" containerName="mariadb-account-create" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.245193 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.247831 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4gmlq" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.248109 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.272776 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lcwlv"] Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.360162 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-job-config-data\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.360387 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-combined-ca-bundle\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.360459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvpz8\" (UniqueName: \"kubernetes.io/projected/d0e16d19-9b17-40a6-91ff-9572ef612fa3-kube-api-access-vvpz8\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.360511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-config-data\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.462007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-config-data\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.462173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-job-config-data\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.462242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-combined-ca-bundle\") pod 
\"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.462265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvpz8\" (UniqueName: \"kubernetes.io/projected/d0e16d19-9b17-40a6-91ff-9572ef612fa3-kube-api-access-vvpz8\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.468155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-config-data\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.468668 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-combined-ca-bundle\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.469567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-job-config-data\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc kubenswrapper[4764]: I1001 16:55:04.498666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvpz8\" (UniqueName: \"kubernetes.io/projected/d0e16d19-9b17-40a6-91ff-9572ef612fa3-kube-api-access-vvpz8\") pod \"manila-db-sync-lcwlv\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:04 crc 
kubenswrapper[4764]: I1001 16:55:04.572031 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.164186 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-lcwlv"] Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.739136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lcwlv" event={"ID":"d0e16d19-9b17-40a6-91ff-9572ef612fa3","Type":"ContainerStarted","Data":"904f59d387d945f90bd493db6f71b25db781f86c0d608e10ca235d31bfb25bcb"} Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.790814 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.791698 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.791833 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.791954 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.821274 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 16:55:05 crc kubenswrapper[4764]: I1001 16:55:05.836740 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 16:55:08 crc kubenswrapper[4764]: I1001 16:55:08.814425 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 16:55:08 crc kubenswrapper[4764]: I1001 16:55:08.815246 4764 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Oct 01 16:55:08 crc kubenswrapper[4764]: I1001 16:55:08.838116 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.792329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554f5d45dd-s9w79" event={"ID":"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d","Type":"ContainerStarted","Data":"5f4af82f24f43280ee86a018a294de69b8bd7faa301eb59042fd448387e0983a"} Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.793036 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-554f5d45dd-s9w79" event={"ID":"b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d","Type":"ContainerStarted","Data":"1459376d21a62e4d258c69af69082ea95f41460b17d49a6e95b4c8917af71190"} Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.793718 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lcwlv" event={"ID":"d0e16d19-9b17-40a6-91ff-9572ef612fa3","Type":"ContainerStarted","Data":"ccdce9c96c62945a607316ee9a35853031bfb486442bc83bbeb25999ca021519"} Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.795407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b86858896-mnqsl" event={"ID":"05cbc202-6448-46cc-85d0-d4b432506ed5","Type":"ContainerStarted","Data":"4e9e3b2c976d63754df1aa1411440231cac6f40aea8592e89bf8193095bcfd0a"} Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.795438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b86858896-mnqsl" event={"ID":"05cbc202-6448-46cc-85d0-d4b432506ed5","Type":"ContainerStarted","Data":"2d7c85ff30844100ee9e0edf53bb3971f83baa32659144e1d89708acae81b02d"} Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.827165 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-554f5d45dd-s9w79" podStartSLOduration=2.822617372 
podStartE2EDuration="39.82714563s" podCreationTimestamp="2025-10-01 16:54:31 +0000 UTC" firstStartedPulling="2025-10-01 16:54:33.06591969 +0000 UTC m=+3136.065566525" lastFinishedPulling="2025-10-01 16:55:10.070447948 +0000 UTC m=+3173.070094783" observedRunningTime="2025-10-01 16:55:10.81901346 +0000 UTC m=+3173.818660295" watchObservedRunningTime="2025-10-01 16:55:10.82714563 +0000 UTC m=+3173.826792455" Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.838884 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b86858896-mnqsl" podStartSLOduration=2.458681612 podStartE2EDuration="39.838868268s" podCreationTimestamp="2025-10-01 16:54:31 +0000 UTC" firstStartedPulling="2025-10-01 16:54:32.74757972 +0000 UTC m=+3135.747226555" lastFinishedPulling="2025-10-01 16:55:10.127766376 +0000 UTC m=+3173.127413211" observedRunningTime="2025-10-01 16:55:10.837116276 +0000 UTC m=+3173.836763131" watchObservedRunningTime="2025-10-01 16:55:10.838868268 +0000 UTC m=+3173.838515093" Oct 01 16:55:10 crc kubenswrapper[4764]: I1001 16:55:10.861264 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-lcwlv" podStartSLOduration=1.941712276 podStartE2EDuration="6.861244799s" podCreationTimestamp="2025-10-01 16:55:04 +0000 UTC" firstStartedPulling="2025-10-01 16:55:05.172484075 +0000 UTC m=+3168.172130910" lastFinishedPulling="2025-10-01 16:55:10.092016598 +0000 UTC m=+3173.091663433" observedRunningTime="2025-10-01 16:55:10.850372092 +0000 UTC m=+3173.850018937" watchObservedRunningTime="2025-10-01 16:55:10.861244799 +0000 UTC m=+3173.860891654" Oct 01 16:55:11 crc kubenswrapper[4764]: I1001 16:55:11.722375 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:55:11 crc kubenswrapper[4764]: E1001 16:55:11.722719 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:55:12 crc kubenswrapper[4764]: I1001 16:55:12.098475 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:55:12 crc kubenswrapper[4764]: I1001 16:55:12.098893 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:55:12 crc kubenswrapper[4764]: I1001 16:55:12.163307 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:55:12 crc kubenswrapper[4764]: I1001 16:55:12.163368 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:55:22 crc kubenswrapper[4764]: I1001 16:55:22.105456 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.255:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.255:8443: connect: connection refused" Oct 01 16:55:22 crc kubenswrapper[4764]: I1001 16:55:22.166285 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-554f5d45dd-s9w79" podUID="b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.0:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.0:8443: connect: connection refused" Oct 01 16:55:23 crc kubenswrapper[4764]: I1001 16:55:23.958280 4764 generic.go:334] "Generic (PLEG): container finished" podID="d0e16d19-9b17-40a6-91ff-9572ef612fa3" 
containerID="ccdce9c96c62945a607316ee9a35853031bfb486442bc83bbeb25999ca021519" exitCode=0 Oct 01 16:55:23 crc kubenswrapper[4764]: I1001 16:55:23.958345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lcwlv" event={"ID":"d0e16d19-9b17-40a6-91ff-9572ef612fa3","Type":"ContainerDied","Data":"ccdce9c96c62945a607316ee9a35853031bfb486442bc83bbeb25999ca021519"} Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.396360 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.509852 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-config-data\") pod \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.510428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvpz8\" (UniqueName: \"kubernetes.io/projected/d0e16d19-9b17-40a6-91ff-9572ef612fa3-kube-api-access-vvpz8\") pod \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.510846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-job-config-data\") pod \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") " Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.511033 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-combined-ca-bundle\") pod \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\" (UID: \"d0e16d19-9b17-40a6-91ff-9572ef612fa3\") 
" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.516467 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e16d19-9b17-40a6-91ff-9572ef612fa3-kube-api-access-vvpz8" (OuterVolumeSpecName: "kube-api-access-vvpz8") pod "d0e16d19-9b17-40a6-91ff-9572ef612fa3" (UID: "d0e16d19-9b17-40a6-91ff-9572ef612fa3"). InnerVolumeSpecName "kube-api-access-vvpz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.516646 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "d0e16d19-9b17-40a6-91ff-9572ef612fa3" (UID: "d0e16d19-9b17-40a6-91ff-9572ef612fa3"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.520832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-config-data" (OuterVolumeSpecName: "config-data") pod "d0e16d19-9b17-40a6-91ff-9572ef612fa3" (UID: "d0e16d19-9b17-40a6-91ff-9572ef612fa3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.560167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0e16d19-9b17-40a6-91ff-9572ef612fa3" (UID: "d0e16d19-9b17-40a6-91ff-9572ef612fa3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.613988 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvpz8\" (UniqueName: \"kubernetes.io/projected/d0e16d19-9b17-40a6-91ff-9572ef612fa3-kube-api-access-vvpz8\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.614023 4764 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.614032 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.614057 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0e16d19-9b17-40a6-91ff-9572ef612fa3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.980898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-lcwlv" event={"ID":"d0e16d19-9b17-40a6-91ff-9572ef612fa3","Type":"ContainerDied","Data":"904f59d387d945f90bd493db6f71b25db781f86c0d608e10ca235d31bfb25bcb"} Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.981230 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="904f59d387d945f90bd493db6f71b25db781f86c0d608e10ca235d31bfb25bcb" Oct 01 16:55:25 crc kubenswrapper[4764]: I1001 16:55:25.981119 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-lcwlv" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.355319 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:26 crc kubenswrapper[4764]: E1001 16:55:26.356142 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e16d19-9b17-40a6-91ff-9572ef612fa3" containerName="manila-db-sync" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.356244 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e16d19-9b17-40a6-91ff-9572ef612fa3" containerName="manila-db-sync" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.356632 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e16d19-9b17-40a6-91ff-9572ef612fa3" containerName="manila-db-sync" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.358124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.365581 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4gmlq" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.365767 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.365867 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.366024 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.380226 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.382374 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.387261 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.401386 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.426966 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431345 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxcc6\" (UniqueName: \"kubernetes.io/projected/d0d66a21-186e-484a-b560-31d2fabb01b1-kube-api-access-dxcc6\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0d66a21-186e-484a-b560-31d2fabb01b1-etc-machine-id\") pod 
\"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431550 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431566 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9c8c\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-kube-api-access-g9c8c\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431642 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-scripts\") pod \"manila-share-share1-0\" (UID: 
\"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431717 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-ceph\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-scripts\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.431799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " 
pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.540891 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.540946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.540963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9c8c\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-kube-api-access-g9c8c\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.540989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-scripts\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541039 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-ceph\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541174 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-scripts\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541205 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541241 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541332 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxcc6\" (UniqueName: \"kubernetes.io/projected/d0d66a21-186e-484a-b560-31d2fabb01b1-kube-api-access-dxcc6\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.541366 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0d66a21-186e-484a-b560-31d2fabb01b1-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.551444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.552203 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-f4vwb"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.553961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-scripts\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.556022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.556481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0d66a21-186e-484a-b560-31d2fabb01b1-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.557648 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-scripts\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.557669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.558361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" 
Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.563708 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.570613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.571666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.575708 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.577083 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.589604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-ceph\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.600902 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-f4vwb"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.605991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9c8c\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-kube-api-access-g9c8c\") pod \"manila-share-share1-0\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.611400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxcc6\" (UniqueName: \"kubernetes.io/projected/d0d66a21-186e-484a-b560-31d2fabb01b1-kube-api-access-dxcc6\") pod \"manila-scheduler-0\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.645493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-config\") pod 
\"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.645740 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.645865 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq82f\" (UniqueName: \"kubernetes.io/projected/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-kube-api-access-mq82f\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.645996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.646114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.646200 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.676857 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.678686 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.683263 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.688538 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.701861 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.702412 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.722868 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:55:26 crc kubenswrapper[4764]: E1001 16:55:26.723146 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.749125 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cd5132c-247e-4944-85e3-e80bdaeb7a03-etc-machine-id\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.751104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-config\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.751275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-scripts\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.751404 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.752110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.753078 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.753135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.752941 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-config\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.753868 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq82f\" (UniqueName: \"kubernetes.io/projected/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-kube-api-access-mq82f\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.754039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cd5132c-247e-4944-85e3-e80bdaeb7a03-logs\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.754196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2zg\" (UniqueName: \"kubernetes.io/projected/0cd5132c-247e-4944-85e3-e80bdaeb7a03-kube-api-access-lp2zg\") pod \"manila-api-0\" (UID: 
\"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.754232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data-custom\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.754289 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.754359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.754456 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.755245 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " 
pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.755651 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.757210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.776352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq82f\" (UniqueName: \"kubernetes.io/projected/4d7d3e82-e4f3-48e1-99ac-949325fec6cb-kube-api-access-mq82f\") pod \"dnsmasq-dns-76b5fdb995-f4vwb\" (UID: \"4d7d3e82-e4f3-48e1-99ac-949325fec6cb\") " pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.856112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2zg\" (UniqueName: \"kubernetes.io/projected/0cd5132c-247e-4944-85e3-e80bdaeb7a03-kube-api-access-lp2zg\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.856159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data-custom\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 
16:55:26.856212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cd5132c-247e-4944-85e3-e80bdaeb7a03-etc-machine-id\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.856261 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-scripts\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.856306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.856327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.856382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cd5132c-247e-4944-85e3-e80bdaeb7a03-logs\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.857119 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cd5132c-247e-4944-85e3-e80bdaeb7a03-logs\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " 
pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.857185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cd5132c-247e-4944-85e3-e80bdaeb7a03-etc-machine-id\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.860443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.860570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.863469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-scripts\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.872368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data-custom\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.879412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2zg\" (UniqueName: 
\"kubernetes.io/projected/0cd5132c-247e-4944-85e3-e80bdaeb7a03-kube-api-access-lp2zg\") pod \"manila-api-0\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " pod="openstack/manila-api-0" Oct 01 16:55:26 crc kubenswrapper[4764]: I1001 16:55:26.993712 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:27 crc kubenswrapper[4764]: I1001 16:55:27.003670 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 01 16:55:27 crc kubenswrapper[4764]: I1001 16:55:27.178395 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:27 crc kubenswrapper[4764]: I1001 16:55:27.364724 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:27 crc kubenswrapper[4764]: I1001 16:55:27.607982 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-f4vwb"] Oct 01 16:55:27 crc kubenswrapper[4764]: W1001 16:55:27.636553 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d7d3e82_e4f3_48e1_99ac_949325fec6cb.slice/crio-d6a95e465b0fdd0ff520dafc622a10aed44764a7f946c0ad19b9558ecf6ecee1 WatchSource:0}: Error finding container d6a95e465b0fdd0ff520dafc622a10aed44764a7f946c0ad19b9558ecf6ecee1: Status 404 returned error can't find the container with id d6a95e465b0fdd0ff520dafc622a10aed44764a7f946c0ad19b9558ecf6ecee1 Oct 01 16:55:27 crc kubenswrapper[4764]: I1001 16:55:27.887773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 01 16:55:28 crc kubenswrapper[4764]: I1001 16:55:28.031244 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d7d3e82-e4f3-48e1-99ac-949325fec6cb" containerID="3baf4cabc318e2a6947e2174fd0999a482d1eb2873e000d28e8e67b9db60db73" exitCode=0 Oct 01 16:55:28 crc 
kubenswrapper[4764]: I1001 16:55:28.031302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" event={"ID":"4d7d3e82-e4f3-48e1-99ac-949325fec6cb","Type":"ContainerDied","Data":"3baf4cabc318e2a6947e2174fd0999a482d1eb2873e000d28e8e67b9db60db73"} Oct 01 16:55:28 crc kubenswrapper[4764]: I1001 16:55:28.031328 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" event={"ID":"4d7d3e82-e4f3-48e1-99ac-949325fec6cb","Type":"ContainerStarted","Data":"d6a95e465b0fdd0ff520dafc622a10aed44764a7f946c0ad19b9558ecf6ecee1"} Oct 01 16:55:28 crc kubenswrapper[4764]: I1001 16:55:28.038203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0cd5132c-247e-4944-85e3-e80bdaeb7a03","Type":"ContainerStarted","Data":"d264270397dc8369afafaee59628385ce4e8b8142065874a6b4845f01b5c44d4"} Oct 01 16:55:28 crc kubenswrapper[4764]: I1001 16:55:28.039643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0f3f2b73-15c6-4249-9206-7951bfd2e1a7","Type":"ContainerStarted","Data":"44a53d95a75b1dd569824e8d9dcf3be9548aab4bbe6499ee63f51f77216f4294"} Oct 01 16:55:28 crc kubenswrapper[4764]: I1001 16:55:28.040667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d0d66a21-186e-484a-b560-31d2fabb01b1","Type":"ContainerStarted","Data":"ce95d46ef5b2473173868748f6eb9dfad57f92fe8c9a26baf780b59ac7c508cd"} Oct 01 16:55:29 crc kubenswrapper[4764]: I1001 16:55:29.061518 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" event={"ID":"4d7d3e82-e4f3-48e1-99ac-949325fec6cb","Type":"ContainerStarted","Data":"2d25d7dd2d8b88bab24972c933647d9ca32cbb2960ea6cb52a9448e79aa8c4f4"} Oct 01 16:55:29 crc kubenswrapper[4764]: I1001 16:55:29.061894 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:29 crc kubenswrapper[4764]: I1001 16:55:29.075611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0cd5132c-247e-4944-85e3-e80bdaeb7a03","Type":"ContainerStarted","Data":"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"} Oct 01 16:55:29 crc kubenswrapper[4764]: I1001 16:55:29.077572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d0d66a21-186e-484a-b560-31d2fabb01b1","Type":"ContainerStarted","Data":"6ee5d01f3e771b82e6710b06dbe360e8dd9d52a199280b316de10e34177675c7"} Oct 01 16:55:29 crc kubenswrapper[4764]: I1001 16:55:29.087337 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" podStartSLOduration=3.087320679 podStartE2EDuration="3.087320679s" podCreationTimestamp="2025-10-01 16:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:55:29.081981517 +0000 UTC m=+3192.081628352" watchObservedRunningTime="2025-10-01 16:55:29.087320679 +0000 UTC m=+3192.086967514" Oct 01 16:55:29 crc kubenswrapper[4764]: I1001 16:55:29.628281 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 01 16:55:30 crc kubenswrapper[4764]: I1001 16:55:30.089714 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0cd5132c-247e-4944-85e3-e80bdaeb7a03","Type":"ContainerStarted","Data":"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"} Oct 01 16:55:30 crc kubenswrapper[4764]: I1001 16:55:30.089898 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 01 16:55:30 crc kubenswrapper[4764]: I1001 16:55:30.093880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"d0d66a21-186e-484a-b560-31d2fabb01b1","Type":"ContainerStarted","Data":"e0228c4449e380c4c919ba5daf54fc18684d6af86252d2ffe795a01c0d656bc2"} Oct 01 16:55:30 crc kubenswrapper[4764]: I1001 16:55:30.114823 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.114800432 podStartE2EDuration="4.114800432s" podCreationTimestamp="2025-10-01 16:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:55:30.1061794 +0000 UTC m=+3193.105826255" watchObservedRunningTime="2025-10-01 16:55:30.114800432 +0000 UTC m=+3193.114447267" Oct 01 16:55:30 crc kubenswrapper[4764]: I1001 16:55:30.130629 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.29020439 podStartE2EDuration="4.130611862s" podCreationTimestamp="2025-10-01 16:55:26 +0000 UTC" firstStartedPulling="2025-10-01 16:55:27.221690127 +0000 UTC m=+3190.221336962" lastFinishedPulling="2025-10-01 16:55:28.062097599 +0000 UTC m=+3191.061744434" observedRunningTime="2025-10-01 16:55:30.124500301 +0000 UTC m=+3193.124147136" watchObservedRunningTime="2025-10-01 16:55:30.130611862 +0000 UTC m=+3193.130258687" Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.113195 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api-log" containerID="cri-o://889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36" gracePeriod=30 Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.113233 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api" containerID="cri-o://b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100" 
gracePeriod=30 Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.766509 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862560 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-combined-ca-bundle\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862624 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cd5132c-247e-4944-85e3-e80bdaeb7a03-logs\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862665 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data-custom\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cd5132c-247e-4944-85e3-e80bdaeb7a03-etc-machine-id\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862898 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-scripts\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") " Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862918 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") "
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862973 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp2zg\" (UniqueName: \"kubernetes.io/projected/0cd5132c-247e-4944-85e3-e80bdaeb7a03-kube-api-access-lp2zg\") pod \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\" (UID: \"0cd5132c-247e-4944-85e3-e80bdaeb7a03\") "
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.862984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0cd5132c-247e-4944-85e3-e80bdaeb7a03-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.863601 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd5132c-247e-4944-85e3-e80bdaeb7a03-logs" (OuterVolumeSpecName: "logs") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.865002 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0cd5132c-247e-4944-85e3-e80bdaeb7a03-logs\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.865021 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cd5132c-247e-4944-85e3-e80bdaeb7a03-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.869126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd5132c-247e-4944-85e3-e80bdaeb7a03-kube-api-access-lp2zg" (OuterVolumeSpecName: "kube-api-access-lp2zg") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "kube-api-access-lp2zg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.887632 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-scripts" (OuterVolumeSpecName: "scripts") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.890379 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.892106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.920475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data" (OuterVolumeSpecName: "config-data") pod "0cd5132c-247e-4944-85e3-e80bdaeb7a03" (UID: "0cd5132c-247e-4944-85e3-e80bdaeb7a03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.966710 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.966745 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.966756 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp2zg\" (UniqueName: \"kubernetes.io/projected/0cd5132c-247e-4944-85e3-e80bdaeb7a03-kube-api-access-lp2zg\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.966766 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:31 crc kubenswrapper[4764]: I1001 16:55:31.966776 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cd5132c-247e-4944-85e3-e80bdaeb7a03-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.050222 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.050747 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="proxy-httpd" containerID="cri-o://35b16b12861892a707ba8d2e07d2c3cee27c29e43f6be0e048fb012bb282f9f2" gracePeriod=30
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.050967 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="sg-core" containerID="cri-o://3a823132482b25719c0252dfc36b189fcb49373c7366e35a99ce58342229cf82" gracePeriod=30
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.051083 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-notification-agent" containerID="cri-o://ad837dda32772c7267432ea5793da46b58b872080a91f39cf12568664f252023" gracePeriod=30
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.051109 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-central-agent" containerID="cri-o://ad1413944c0e57d62005405d20adc25e72f611b869671fd6b77b759aed6621cb" gracePeriod=30
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.096152 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.255:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.255:8443: connect: connection refused"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155333 4764 generic.go:334] "Generic (PLEG): container finished" podID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerID="b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100" exitCode=0
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155379 4764 generic.go:334] "Generic (PLEG): container finished" podID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerID="889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36" exitCode=143
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155409 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0cd5132c-247e-4944-85e3-e80bdaeb7a03","Type":"ContainerDied","Data":"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"}
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155443 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0cd5132c-247e-4944-85e3-e80bdaeb7a03","Type":"ContainerDied","Data":"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"}
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"0cd5132c-247e-4944-85e3-e80bdaeb7a03","Type":"ContainerDied","Data":"d264270397dc8369afafaee59628385ce4e8b8142065874a6b4845f01b5c44d4"}
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155477 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.155493 4764 scope.go:117] "RemoveContainer" containerID="b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.164162 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-554f5d45dd-s9w79" podUID="b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.0:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.0:8443: connect: connection refused"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.194311 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"]
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.219183 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"]
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.231212 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"]
Oct 01 16:55:32 crc kubenswrapper[4764]: E1001 16:55:32.231621 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.231638 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api"
Oct 01 16:55:32 crc kubenswrapper[4764]: E1001 16:55:32.231662 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api-log"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.231669 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api-log"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.231832 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.231859 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" containerName="manila-api-log"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.232849 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.233819 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.259840 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.260348 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.261252 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-public-tls-certs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386438 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-logs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386729 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-config-data-custom\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386751 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-scripts\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dktd9\" (UniqueName: \"kubernetes.io/projected/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-kube-api-access-dktd9\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386865 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-etc-machine-id\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.386973 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-config-data\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.387270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-config-data-custom\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dktd9\" (UniqueName: \"kubernetes.io/projected/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-kube-api-access-dktd9\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-scripts\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-etc-machine-id\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-config-data\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489564 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-public-tls-certs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.489701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-logs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.490150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-logs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.490584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-etc-machine-id\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.495114 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-public-tls-certs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.500684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-config-data-custom\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.502483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-config-data\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.502952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-internal-tls-certs\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.508691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-scripts\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.511241 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.515614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dktd9\" (UniqueName: \"kubernetes.io/projected/aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e-kube-api-access-dktd9\") pod \"manila-api-0\" (UID: \"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e\") " pod="openstack/manila-api-0"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.570019 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.186:3000/\": dial tcp 10.217.0.186:3000: connect: connection refused"
Oct 01 16:55:32 crc kubenswrapper[4764]: I1001 16:55:32.596665 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0"
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168171 4764 generic.go:334] "Generic (PLEG): container finished" podID="dee403a9-1cba-407b-9235-187a8553761d" containerID="35b16b12861892a707ba8d2e07d2c3cee27c29e43f6be0e048fb012bb282f9f2" exitCode=0
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168487 4764 generic.go:334] "Generic (PLEG): container finished" podID="dee403a9-1cba-407b-9235-187a8553761d" containerID="3a823132482b25719c0252dfc36b189fcb49373c7366e35a99ce58342229cf82" exitCode=2
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168497 4764 generic.go:334] "Generic (PLEG): container finished" podID="dee403a9-1cba-407b-9235-187a8553761d" containerID="ad837dda32772c7267432ea5793da46b58b872080a91f39cf12568664f252023" exitCode=0
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168505 4764 generic.go:334] "Generic (PLEG): container finished" podID="dee403a9-1cba-407b-9235-187a8553761d" containerID="ad1413944c0e57d62005405d20adc25e72f611b869671fd6b77b759aed6621cb" exitCode=0
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerDied","Data":"35b16b12861892a707ba8d2e07d2c3cee27c29e43f6be0e048fb012bb282f9f2"}
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerDied","Data":"3a823132482b25719c0252dfc36b189fcb49373c7366e35a99ce58342229cf82"}
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerDied","Data":"ad837dda32772c7267432ea5793da46b58b872080a91f39cf12568664f252023"}
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.168563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerDied","Data":"ad1413944c0e57d62005405d20adc25e72f611b869671fd6b77b759aed6621cb"}
Oct 01 16:55:33 crc kubenswrapper[4764]: I1001 16:55:33.738675 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd5132c-247e-4944-85e3-e80bdaeb7a03" path="/var/lib/kubelet/pods/0cd5132c-247e-4944-85e3-e80bdaeb7a03/volumes"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.180246 4764 scope.go:117] "RemoveContainer" containerID="889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.561897 4764 scope.go:117] "RemoveContainer" containerID="b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"
Oct 01 16:55:35 crc kubenswrapper[4764]: E1001 16:55:35.562799 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100\": container with ID starting with b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100 not found: ID does not exist" containerID="b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.562824 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"} err="failed to get container status \"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100\": rpc error: code = NotFound desc = could not find container \"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100\": container with ID starting with b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100 not found: ID does not exist"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.562843 4764 scope.go:117] "RemoveContainer" containerID="889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"
Oct 01 16:55:35 crc kubenswrapper[4764]: E1001 16:55:35.566262 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36\": container with ID starting with 889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36 not found: ID does not exist" containerID="889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.566298 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"} err="failed to get container status \"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36\": rpc error: code = NotFound desc = could not find container \"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36\": container with ID starting with 889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36 not found: ID does not exist"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.566328 4764 scope.go:117] "RemoveContainer" containerID="b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.566594 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100"} err="failed to get container status \"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100\": rpc error: code = NotFound desc = could not find container \"b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100\": container with ID starting with b27d2971d5bd56b338de34b09cbca505586b5f18c0a6eaa5a26cd24191378100 not found: ID does not exist"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.566611 4764 scope.go:117] "RemoveContainer" containerID="889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.566818 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36"} err="failed to get container status \"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36\": rpc error: code = NotFound desc = could not find container \"889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36\": container with ID starting with 889e2ce42c9cbb8a15b198f81c590eb387ed9201bef87ac80df7691e1bf4dc36 not found: ID does not exist"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.680897 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862510 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfzcr\" (UniqueName: \"kubernetes.io/projected/dee403a9-1cba-407b-9235-187a8553761d-kube-api-access-jfzcr\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-log-httpd\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-config-data\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862736 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-run-httpd\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862757 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-scripts\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-combined-ca-bundle\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-ceilometer-tls-certs\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.862955 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-sg-core-conf-yaml\") pod \"dee403a9-1cba-407b-9235-187a8553761d\" (UID: \"dee403a9-1cba-407b-9235-187a8553761d\") "
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.864246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.864621 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.867393 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"]
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.869449 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dee403a9-1cba-407b-9235-187a8553761d-kube-api-access-jfzcr" (OuterVolumeSpecName: "kube-api-access-jfzcr") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "kube-api-access-jfzcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.870894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-scripts" (OuterVolumeSpecName: "scripts") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.894186 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.925325 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.965576 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.965609 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfzcr\" (UniqueName: \"kubernetes.io/projected/dee403a9-1cba-407b-9235-187a8553761d-kube-api-access-jfzcr\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.965620 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.965628 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dee403a9-1cba-407b-9235-187a8553761d-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.965637 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.965646 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.967097 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:35 crc kubenswrapper[4764]: I1001 16:55:35.970986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-config-data" (OuterVolumeSpecName: "config-data") pod "dee403a9-1cba-407b-9235-187a8553761d" (UID: "dee403a9-1cba-407b-9235-187a8553761d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.066884 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.066911 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dee403a9-1cba-407b-9235-187a8553761d-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.219206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dee403a9-1cba-407b-9235-187a8553761d","Type":"ContainerDied","Data":"bb98e5eb03e5d1ffe0c479129fc3720df8864e5df7aa54920d4db978d399b55e"}
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.219529 4764 scope.go:117] "RemoveContainer" containerID="35b16b12861892a707ba8d2e07d2c3cee27c29e43f6be0e048fb012bb282f9f2"
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.219656 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.244965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0f3f2b73-15c6-4249-9206-7951bfd2e1a7","Type":"ContainerStarted","Data":"268a56a3776ad5abf45ff01451bfaf631cdffa6b439721b5a272f55e3321d8d9"}
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.252034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e","Type":"ContainerStarted","Data":"b1d68adf53914cf81df7757cfb54b2dc69b903e98b7fafdc1d1906c554968b2f"}
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.338896 4764 scope.go:117] "RemoveContainer" containerID="3a823132482b25719c0252dfc36b189fcb49373c7366e35a99ce58342229cf82"
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.363106 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.366776 4764 scope.go:117] "RemoveContainer" containerID="ad837dda32772c7267432ea5793da46b58b872080a91f39cf12568664f252023"
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.377222 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.400482 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:36 crc kubenswrapper[4764]: E1001 16:55:36.400862 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-central-agent"
Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.400879 4764 state_mem.go:107] "Deleted CPUSet assignment"
podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-central-agent" Oct 01 16:55:36 crc kubenswrapper[4764]: E1001 16:55:36.400896 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="sg-core" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.400902 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="sg-core" Oct 01 16:55:36 crc kubenswrapper[4764]: E1001 16:55:36.400912 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-notification-agent" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.400918 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-notification-agent" Oct 01 16:55:36 crc kubenswrapper[4764]: E1001 16:55:36.400928 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="proxy-httpd" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.400933 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="proxy-httpd" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.401129 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-central-agent" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.401146 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="proxy-httpd" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.401154 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="sg-core" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.401161 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="dee403a9-1cba-407b-9235-187a8553761d" containerName="ceilometer-notification-agent" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.402747 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.406422 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.406518 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.406653 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.413705 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.433372 4764 scope.go:117] "RemoveContainer" containerID="ad1413944c0e57d62005405d20adc25e72f611b869671fd6b77b759aed6621cb" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472414 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l6br\" (UniqueName: \"kubernetes.io/projected/6df3463f-6f50-46f1-9d22-b674225b7da9-kube-api-access-8l6br\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472517 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-scripts\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472546 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-config-data\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-run-httpd\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-log-httpd\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.472746 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574102 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-run-httpd\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-log-httpd\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574507 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-run-httpd\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l6br\" (UniqueName: \"kubernetes.io/projected/6df3463f-6f50-46f1-9d22-b674225b7da9-kube-api-access-8l6br\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: 
I1001 16:55:36.574678 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574764 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-scripts\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-config-data\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574849 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-log-httpd\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.574939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.579832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.580338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.580791 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-config-data\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.580990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.581339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-scripts\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.601303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l6br\" (UniqueName: \"kubernetes.io/projected/6df3463f-6f50-46f1-9d22-b674225b7da9-kube-api-access-8l6br\") pod \"ceilometer-0\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") " pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.703281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 01 
16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.731801 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:55:36 crc kubenswrapper[4764]: I1001 16:55:36.996224 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-f4vwb" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.070220 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f7bn8"] Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.070433 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerName="dnsmasq-dns" containerID="cri-o://f40c13c0ad3da5ee48ba730bc10a564b9bfef1fa53bd6b296f3cad125a7a5854" gracePeriod=10 Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.225107 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.281126 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0f3f2b73-15c6-4249-9206-7951bfd2e1a7","Type":"ContainerStarted","Data":"00b0ce2e9c7a0ee4bd98d8086d80e972394da2284f9823baa3132bcee98bce80"} Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.290364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerStarted","Data":"c222fb854bb919bef5c950806e5a893b07d48e4bf8a85947ced6e548238fdda8"} Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.302729 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e","Type":"ContainerStarted","Data":"5b2a1199c4b7cbb231a85eae59e72ac0b1d4c395d6750b55774a43a68e3fd99e"} Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.302784 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e","Type":"ContainerStarted","Data":"e5d29f6f5768c31cea625af23084361455be096db63fd55a03157dce8a61472a"} Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.303192 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.311699 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.122700489 podStartE2EDuration="11.311665553s" podCreationTimestamp="2025-10-01 16:55:26 +0000 UTC" firstStartedPulling="2025-10-01 16:55:27.377171232 +0000 UTC m=+3190.376818067" lastFinishedPulling="2025-10-01 16:55:35.566136296 +0000 UTC m=+3198.565783131" observedRunningTime="2025-10-01 16:55:37.305809648 +0000 UTC m=+3200.305456493" watchObservedRunningTime="2025-10-01 16:55:37.311665553 +0000 UTC m=+3200.311312388" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.349638 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=5.349621406 podStartE2EDuration="5.349621406s" podCreationTimestamp="2025-10-01 16:55:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:55:37.335266393 +0000 UTC m=+3200.334913228" watchObservedRunningTime="2025-10-01 16:55:37.349621406 +0000 UTC m=+3200.349268241" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.353703 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerID="f40c13c0ad3da5ee48ba730bc10a564b9bfef1fa53bd6b296f3cad125a7a5854" exitCode=0 Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.353888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" 
event={"ID":"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4","Type":"ContainerDied","Data":"f40c13c0ad3da5ee48ba730bc10a564b9bfef1fa53bd6b296f3cad125a7a5854"} Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.654307 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.735503 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dee403a9-1cba-407b-9235-187a8553761d" path="/var/lib/kubelet/pods/dee403a9-1cba-407b-9235-187a8553761d/volumes" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.800577 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-nb\") pod \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.800820 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-openstack-edpm-ipam\") pod \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.800863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-config\") pod \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.800892 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-sb\") pod \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " 
Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.800936 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m957\" (UniqueName: \"kubernetes.io/projected/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-kube-api-access-8m957\") pod \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.800995 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-dns-svc\") pod \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\" (UID: \"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4\") " Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.807382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-kube-api-access-8m957" (OuterVolumeSpecName: "kube-api-access-8m957") pod "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" (UID: "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4"). InnerVolumeSpecName "kube-api-access-8m957". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.851327 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-config" (OuterVolumeSpecName: "config") pod "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" (UID: "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.857900 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" (UID: "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.858134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" (UID: "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.865244 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" (UID: "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.879143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" (UID: "ff2f77d9-2e42-4764-b0b5-6a4b13877ed4"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.903302 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.903338 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-config\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.903352 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.903366 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m957\" (UniqueName: \"kubernetes.io/projected/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-kube-api-access-8m957\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.903378 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:37 crc kubenswrapper[4764]: I1001 16:55:37.903390 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.371296 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.371364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-f7bn8" event={"ID":"ff2f77d9-2e42-4764-b0b5-6a4b13877ed4","Type":"ContainerDied","Data":"fab9eeccb127c2f580a085ae7da46c987a85005632ecd55d25908a3be04e5bac"} Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.372178 4764 scope.go:117] "RemoveContainer" containerID="f40c13c0ad3da5ee48ba730bc10a564b9bfef1fa53bd6b296f3cad125a7a5854" Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.406936 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f7bn8"] Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.415969 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-f7bn8"] Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.440027 4764 scope.go:117] "RemoveContainer" containerID="cb5432978d44c0c2e729353a7af8c1642f9976509ecca80bd3dbe2945ec85b3e" Oct 01 16:55:38 crc kubenswrapper[4764]: I1001 16:55:38.721506 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:55:38 crc kubenswrapper[4764]: E1001 16:55:38.721732 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:55:39 crc kubenswrapper[4764]: I1001 16:55:39.384124 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerStarted","Data":"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"} Oct 01 16:55:39 crc kubenswrapper[4764]: I1001 16:55:39.741923 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" path="/var/lib/kubelet/pods/ff2f77d9-2e42-4764-b0b5-6a4b13877ed4/volumes" Oct 01 16:55:39 crc kubenswrapper[4764]: I1001 16:55:39.989321 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:55:40 crc kubenswrapper[4764]: I1001 16:55:40.395173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerStarted","Data":"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"} Oct 01 16:55:41 crc kubenswrapper[4764]: I1001 16:55:41.407021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerStarted","Data":"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"} Oct 01 16:55:43 crc kubenswrapper[4764]: I1001 16:55:43.427488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerStarted","Data":"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"} Oct 01 16:55:43 crc kubenswrapper[4764]: I1001 16:55:43.427948 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-central-agent" containerID="cri-o://0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca" gracePeriod=30 Oct 01 16:55:43 crc kubenswrapper[4764]: I1001 16:55:43.428252 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:55:43 crc 
kubenswrapper[4764]: I1001 16:55:43.428487 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="proxy-httpd" containerID="cri-o://33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29" gracePeriod=30
Oct 01 16:55:43 crc kubenswrapper[4764]: I1001 16:55:43.428526 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="sg-core" containerID="cri-o://fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77" gracePeriod=30
Oct 01 16:55:43 crc kubenswrapper[4764]: I1001 16:55:43.428556 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-notification-agent" containerID="cri-o://9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44" gracePeriod=30
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.193805 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.359025 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-combined-ca-bundle\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.359379 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l6br\" (UniqueName: \"kubernetes.io/projected/6df3463f-6f50-46f1-9d22-b674225b7da9-kube-api-access-8l6br\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.359560 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-sg-core-conf-yaml\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.359810 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-ceilometer-tls-certs\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.359985 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-run-httpd\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.360122 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-scripts\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.360284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-log-httpd\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.360456 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-config-data\") pod \"6df3463f-6f50-46f1-9d22-b674225b7da9\" (UID: \"6df3463f-6f50-46f1-9d22-b674225b7da9\") "
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.360500 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.360560 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.361562 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.361665 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6df3463f-6f50-46f1-9d22-b674225b7da9-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.365635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-scripts" (OuterVolumeSpecName: "scripts") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.365643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df3463f-6f50-46f1-9d22-b674225b7da9-kube-api-access-8l6br" (OuterVolumeSpecName: "kube-api-access-8l6br") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "kube-api-access-8l6br". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.402811 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440032 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440297 4764 generic.go:334] "Generic (PLEG): container finished" podID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29" exitCode=0
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440323 4764 generic.go:334] "Generic (PLEG): container finished" podID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77" exitCode=2
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440331 4764 generic.go:334] "Generic (PLEG): container finished" podID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44" exitCode=0
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440340 4764 generic.go:334] "Generic (PLEG): container finished" podID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca" exitCode=0
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerDied","Data":"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"}
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerDied","Data":"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"}
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440402 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerDied","Data":"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"}
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerDied","Data":"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"}
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6df3463f-6f50-46f1-9d22-b674225b7da9","Type":"ContainerDied","Data":"c222fb854bb919bef5c950806e5a893b07d48e4bf8a85947ced6e548238fdda8"}
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440446 4764 scope.go:117] "RemoveContainer" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.440584 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.445488 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.463721 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.463755 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.463767 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.463776 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l6br\" (UniqueName: \"kubernetes.io/projected/6df3463f-6f50-46f1-9d22-b674225b7da9-kube-api-access-8l6br\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.463787 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.464456 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-config-data" (OuterVolumeSpecName: "config-data") pod "6df3463f-6f50-46f1-9d22-b674225b7da9" (UID: "6df3463f-6f50-46f1-9d22-b674225b7da9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.468164 4764 scope.go:117] "RemoveContainer" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.489113 4764 scope.go:117] "RemoveContainer" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.510986 4764 scope.go:117] "RemoveContainer" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.535537 4764 scope.go:117] "RemoveContainer" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.536259 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": container with ID starting with 33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29 not found: ID does not exist" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.536339 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"} err="failed to get container status \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": rpc error: code = NotFound desc = could not find container \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": container with ID starting with 33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.536389 4764 scope.go:117] "RemoveContainer" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.536857 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": container with ID starting with fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77 not found: ID does not exist" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.536915 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"} err="failed to get container status \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": rpc error: code = NotFound desc = could not find container \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": container with ID starting with fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.536942 4764 scope.go:117] "RemoveContainer" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.537822 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": container with ID starting with 9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44 not found: ID does not exist" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.537847 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"} err="failed to get container status \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": rpc error: code = NotFound desc = could not find container \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": container with ID starting with 9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.537863 4764 scope.go:117] "RemoveContainer" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.538455 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": container with ID starting with 0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca not found: ID does not exist" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.538486 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"} err="failed to get container status \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": rpc error: code = NotFound desc = could not find container \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": container with ID starting with 0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.538502 4764 scope.go:117] "RemoveContainer" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.538807 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"} err="failed to get container status \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": rpc error: code = NotFound desc = could not find container \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": container with ID starting with 33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.538826 4764 scope.go:117] "RemoveContainer" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539065 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"} err="failed to get container status \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": rpc error: code = NotFound desc = could not find container \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": container with ID starting with fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539085 4764 scope.go:117] "RemoveContainer" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539299 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"} err="failed to get container status \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": rpc error: code = NotFound desc = could not find container \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": container with ID starting with 9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539342 4764 scope.go:117] "RemoveContainer" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539553 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"} err="failed to get container status \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": rpc error: code = NotFound desc = could not find container \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": container with ID starting with 0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539569 4764 scope.go:117] "RemoveContainer" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539734 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"} err="failed to get container status \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": rpc error: code = NotFound desc = could not find container \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": container with ID starting with 33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539751 4764 scope.go:117] "RemoveContainer" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539910 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"} err="failed to get container status \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": rpc error: code = NotFound desc = could not find container \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": container with ID starting with fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.539927 4764 scope.go:117] "RemoveContainer" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540086 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"} err="failed to get container status \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": rpc error: code = NotFound desc = could not find container \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": container with ID starting with 9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540106 4764 scope.go:117] "RemoveContainer" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540250 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"} err="failed to get container status \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": rpc error: code = NotFound desc = could not find container \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": container with ID starting with 0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540270 4764 scope.go:117] "RemoveContainer" containerID="33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540457 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29"} err="failed to get container status \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": rpc error: code = NotFound desc = could not find container \"33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29\": container with ID starting with 33c8285005a9ddc31b2b2a9f673141fcaea237fe8a50e42f61cf636450e3fc29 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540477 4764 scope.go:117] "RemoveContainer" containerID="fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540614 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77"} err="failed to get container status \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": rpc error: code = NotFound desc = could not find container \"fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77\": container with ID starting with fde98daa2024375b1d4dcfaa7b1d7bf2f9e3cb78608fc3ca8f81045001abde77 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540631 4764 scope.go:117] "RemoveContainer" containerID="9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540756 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44"} err="failed to get container status \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": rpc error: code = NotFound desc = could not find container \"9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44\": container with ID starting with 9b283c28e0004574bb104f30a821e4520f90547c5c49a70d222a385e7bffbf44 not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540772 4764 scope.go:117] "RemoveContainer" containerID="0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.540894 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca"} err="failed to get container status \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": rpc error: code = NotFound desc = could not find container \"0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca\": container with ID starting with 0b1918c59a7945035dbf1e872f42a921d6384c59af764ddc7bc31ac9c7e638ca not found: ID does not exist"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.567091 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6df3463f-6f50-46f1-9d22-b674225b7da9-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.781423 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.796230 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.811334 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.811957 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="sg-core"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.811991 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="sg-core"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.812030 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="proxy-httpd"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812066 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="proxy-httpd"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.812114 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerName="dnsmasq-dns"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812127 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerName="dnsmasq-dns"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.812150 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-notification-agent"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812161 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-notification-agent"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.812178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-central-agent"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812189 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-central-agent"
Oct 01 16:55:44 crc kubenswrapper[4764]: E1001 16:55:44.812213 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerName="init"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812222 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerName="init"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812555 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-central-agent"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812598 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="ceilometer-notification-agent"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812625 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="sg-core"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812655 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" containerName="proxy-httpd"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.812670 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2f77d9-2e42-4764-b0b5-6a4b13877ed4" containerName="dnsmasq-dns"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.815008 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.817423 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.817674 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.817853 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.823912 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.873843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51273dda-10be-4519-adeb-992fa5936387-run-httpd\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.873908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.874228 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-scripts\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.874363 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-config-data\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.874428 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.874505 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.874591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86bdw\" (UniqueName: \"kubernetes.io/projected/51273dda-10be-4519-adeb-992fa5936387-kube-api-access-86bdw\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.874635 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51273dda-10be-4519-adeb-992fa5936387-log-httpd\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.977733 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-scripts\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.977846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-config-data\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.977900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.977951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.978017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86bdw\" (UniqueName: \"kubernetes.io/projected/51273dda-10be-4519-adeb-992fa5936387-kube-api-access-86bdw\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.978091 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51273dda-10be-4519-adeb-992fa5936387-log-httpd\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.978245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51273dda-10be-4519-adeb-992fa5936387-run-httpd\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.978298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.978895 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51273dda-10be-4519-adeb-992fa5936387-log-httpd\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.978902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51273dda-10be-4519-adeb-992fa5936387-run-httpd\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.983316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-scripts\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.983471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0"
Oct 01 16:55:44 crc kubenswrapper[4764]: I1001
16:55:44.984011 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-config-data\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0" Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.989026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0" Oct 01 16:55:44 crc kubenswrapper[4764]: I1001 16:55:44.991847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51273dda-10be-4519-adeb-992fa5936387-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0" Oct 01 16:55:45 crc kubenswrapper[4764]: I1001 16:55:45.002581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86bdw\" (UniqueName: \"kubernetes.io/projected/51273dda-10be-4519-adeb-992fa5936387-kube-api-access-86bdw\") pod \"ceilometer-0\" (UID: \"51273dda-10be-4519-adeb-992fa5936387\") " pod="openstack/ceilometer-0" Oct 01 16:55:45 crc kubenswrapper[4764]: I1001 16:55:45.140948 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 16:55:45 crc kubenswrapper[4764]: I1001 16:55:45.219131 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:55:45 crc kubenswrapper[4764]: I1001 16:55:45.497132 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:55:45 crc kubenswrapper[4764]: W1001 16:55:45.603599 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51273dda_10be_4519_adeb_992fa5936387.slice/crio-087046cbe46488ef6d1dc074d221bced87a5d0d17b441840ed25cfafbe03deba WatchSource:0}: Error finding container 087046cbe46488ef6d1dc074d221bced87a5d0d17b441840ed25cfafbe03deba: Status 404 returned error can't find the container with id 087046cbe46488ef6d1dc074d221bced87a5d0d17b441840ed25cfafbe03deba Oct 01 16:55:45 crc kubenswrapper[4764]: I1001 16:55:45.607567 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 16:55:45 crc kubenswrapper[4764]: I1001 16:55:45.737211 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df3463f-6f50-46f1-9d22-b674225b7da9" path="/var/lib/kubelet/pods/6df3463f-6f50-46f1-9d22-b674225b7da9/volumes" Oct 01 16:55:46 crc kubenswrapper[4764]: I1001 16:55:46.473787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51273dda-10be-4519-adeb-992fa5936387","Type":"ContainerStarted","Data":"087046cbe46488ef6d1dc074d221bced87a5d0d17b441840ed25cfafbe03deba"} Oct 01 16:55:46 crc kubenswrapper[4764]: I1001 16:55:46.685196 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 01 16:55:47 crc kubenswrapper[4764]: I1001 16:55:47.435038 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-554f5d45dd-s9w79" Oct 01 16:55:47 crc kubenswrapper[4764]: I1001 16:55:47.469686 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:55:47 crc kubenswrapper[4764]: I1001 16:55:47.507276 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b86858896-mnqsl"] Oct 01 16:55:47 crc kubenswrapper[4764]: I1001 16:55:47.519552 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon-log" containerID="cri-o://2d7c85ff30844100ee9e0edf53bb3971f83baa32659144e1d89708acae81b02d" gracePeriod=30 Oct 01 16:55:47 crc kubenswrapper[4764]: I1001 16:55:47.519681 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51273dda-10be-4519-adeb-992fa5936387","Type":"ContainerStarted","Data":"c7c8fe46b3d0fd9f8429f84c1834e2742f3076a5ca09746e454ff3fafbec9d90"} Oct 01 16:55:47 crc kubenswrapper[4764]: I1001 16:55:47.520093 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" containerID="cri-o://4e9e3b2c976d63754df1aa1411440231cac6f40aea8592e89bf8193095bcfd0a" gracePeriod=30 Oct 01 16:55:48 crc kubenswrapper[4764]: I1001 16:55:48.531090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51273dda-10be-4519-adeb-992fa5936387","Type":"ContainerStarted","Data":"78de7d120c5231ce13e55b2a8ba2d86839e3c536d7396ad42702750e1e08aa32"} Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.337309 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.412346 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-scheduler-0"] Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.430223 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.504026 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.547847 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="manila-scheduler" containerID="cri-o://6ee5d01f3e771b82e6710b06dbe360e8dd9d52a199280b316de10e34177675c7" gracePeriod=30 Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.548487 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51273dda-10be-4519-adeb-992fa5936387","Type":"ContainerStarted","Data":"97297b93a5fa502cf5b252d96b97ab1790b1ceca7888dfa05a81ee44f0b81fef"} Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.548556 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="probe" containerID="cri-o://e0228c4449e380c4c919ba5daf54fc18684d6af86252d2ffe795a01c0d656bc2" gracePeriod=30 Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.548702 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="manila-share" containerID="cri-o://268a56a3776ad5abf45ff01451bfaf631cdffa6b439721b5a272f55e3321d8d9" gracePeriod=30 Oct 01 16:55:49 crc kubenswrapper[4764]: I1001 16:55:49.548755 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="probe" 
containerID="cri-o://00b0ce2e9c7a0ee4bd98d8086d80e972394da2284f9823baa3132bcee98bce80" gracePeriod=30 Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.565678 4764 generic.go:334] "Generic (PLEG): container finished" podID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerID="00b0ce2e9c7a0ee4bd98d8086d80e972394da2284f9823baa3132bcee98bce80" exitCode=0 Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.566439 4764 generic.go:334] "Generic (PLEG): container finished" podID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerID="268a56a3776ad5abf45ff01451bfaf631cdffa6b439721b5a272f55e3321d8d9" exitCode=1 Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.565746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0f3f2b73-15c6-4249-9206-7951bfd2e1a7","Type":"ContainerDied","Data":"00b0ce2e9c7a0ee4bd98d8086d80e972394da2284f9823baa3132bcee98bce80"} Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.566542 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0f3f2b73-15c6-4249-9206-7951bfd2e1a7","Type":"ContainerDied","Data":"268a56a3776ad5abf45ff01451bfaf631cdffa6b439721b5a272f55e3321d8d9"} Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.566557 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"0f3f2b73-15c6-4249-9206-7951bfd2e1a7","Type":"ContainerDied","Data":"44a53d95a75b1dd569824e8d9dcf3be9548aab4bbe6499ee63f51f77216f4294"} Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.566570 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44a53d95a75b1dd569824e8d9dcf3be9548aab4bbe6499ee63f51f77216f4294" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.569122 4764 generic.go:334] "Generic (PLEG): container finished" podID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerID="e0228c4449e380c4c919ba5daf54fc18684d6af86252d2ffe795a01c0d656bc2" 
exitCode=0 Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.569216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d0d66a21-186e-484a-b560-31d2fabb01b1","Type":"ContainerDied","Data":"e0228c4449e380c4c919ba5daf54fc18684d6af86252d2ffe795a01c0d656bc2"} Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.678748 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.736323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-etc-machine-id\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.736517 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9c8c\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-kube-api-access-g9c8c\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737268 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data-custom\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-var-lib-manila\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737402 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-scripts\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737725 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-combined-ca-bundle\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.737833 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-ceph\") pod \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\" (UID: \"0f3f2b73-15c6-4249-9206-7951bfd2e1a7\") " Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.739092 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.739111 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.744524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-kube-api-access-g9c8c" (OuterVolumeSpecName: "kube-api-access-g9c8c") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "kube-api-access-g9c8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.745126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-ceph" (OuterVolumeSpecName: "ceph") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.745315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.747502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-scripts" (OuterVolumeSpecName: "scripts") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.806511 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.843646 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.846011 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.846063 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-ceph\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.846078 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9c8c\" (UniqueName: \"kubernetes.io/projected/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-kube-api-access-g9c8c\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.846092 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.858257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data" (OuterVolumeSpecName: "config-data") pod "0f3f2b73-15c6-4249-9206-7951bfd2e1a7" (UID: "0f3f2b73-15c6-4249-9206-7951bfd2e1a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:50 crc kubenswrapper[4764]: I1001 16:55:50.947701 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3f2b73-15c6-4249-9206-7951bfd2e1a7-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.589523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51273dda-10be-4519-adeb-992fa5936387","Type":"ContainerStarted","Data":"86fec18432c988e05db4e95d3cb5d49a4ea05de60886346c418db3cfb8be553a"} Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.593281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.600608 4764 generic.go:334] "Generic (PLEG): container finished" podID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerID="4e9e3b2c976d63754df1aa1411440231cac6f40aea8592e89bf8193095bcfd0a" exitCode=0 Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.600864 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.604432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b86858896-mnqsl" event={"ID":"05cbc202-6448-46cc-85d0-d4b432506ed5","Type":"ContainerDied","Data":"4e9e3b2c976d63754df1aa1411440231cac6f40aea8592e89bf8193095bcfd0a"} Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.633335 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.715623583 podStartE2EDuration="7.633307699s" podCreationTimestamp="2025-10-01 16:55:44 +0000 UTC" firstStartedPulling="2025-10-01 16:55:45.606295156 +0000 UTC m=+3208.605942001" lastFinishedPulling="2025-10-01 16:55:50.523979262 +0000 UTC m=+3213.523626117" observedRunningTime="2025-10-01 16:55:51.626357758 +0000 UTC m=+3214.626004623" watchObservedRunningTime="2025-10-01 16:55:51.633307699 +0000 UTC m=+3214.632954554" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.662499 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.673952 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.700090 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:51 crc kubenswrapper[4764]: E1001 16:55:51.700679 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="probe" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.700705 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="probe" Oct 01 16:55:51 crc kubenswrapper[4764]: E1001 16:55:51.700737 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" 
containerName="manila-share" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.700748 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="manila-share" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.701023 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="probe" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.701085 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" containerName="manila-share" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.702594 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.705315 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.738984 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3f2b73-15c6-4249-9206-7951bfd2e1a7" path="/var/lib/kubelet/pods/0f3f2b73-15c6-4249-9206-7951bfd2e1a7/volumes" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.739816 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ca31e39f-7fcb-4389-882d-cfa2d4491df4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/ca31e39f-7fcb-4389-882d-cfa2d4491df4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-config-data\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776594 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-scripts\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776844 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2xx\" (UniqueName: \"kubernetes.io/projected/ca31e39f-7fcb-4389-882d-cfa2d4491df4-kube-api-access-sh2xx\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.776919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/ca31e39f-7fcb-4389-882d-cfa2d4491df4-ceph\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.777026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ca31e39f-7fcb-4389-882d-cfa2d4491df4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878827 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca31e39f-7fcb-4389-882d-cfa2d4491df4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878899 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-config-data\") pod \"manila-share-share1-0\" 
(UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878922 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878941 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ca31e39f-7fcb-4389-882d-cfa2d4491df4-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.878983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-scripts\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.879093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2xx\" (UniqueName: \"kubernetes.io/projected/ca31e39f-7fcb-4389-882d-cfa2d4491df4-kube-api-access-sh2xx\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.879090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/ca31e39f-7fcb-4389-882d-cfa2d4491df4-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 
16:55:51.879137 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ca31e39f-7fcb-4389-882d-cfa2d4491df4-ceph\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.883977 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.884876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ca31e39f-7fcb-4389-882d-cfa2d4491df4-ceph\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.889880 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.896522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-scripts\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.904501 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2xx\" (UniqueName: 
\"kubernetes.io/projected/ca31e39f-7fcb-4389-882d-cfa2d4491df4-kube-api-access-sh2xx\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:51 crc kubenswrapper[4764]: I1001 16:55:51.905323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca31e39f-7fcb-4389-882d-cfa2d4491df4-config-data\") pod \"manila-share-share1-0\" (UID: \"ca31e39f-7fcb-4389-882d-cfa2d4491df4\") " pod="openstack/manila-share-share1-0" Oct 01 16:55:52 crc kubenswrapper[4764]: I1001 16:55:52.020213 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 01 16:55:52 crc kubenswrapper[4764]: I1001 16:55:52.097155 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.255:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.255:8443: connect: connection refused" Oct 01 16:55:52 crc kubenswrapper[4764]: I1001 16:55:52.621212 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 01 16:55:52 crc kubenswrapper[4764]: I1001 16:55:52.721834 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:55:52 crc kubenswrapper[4764]: E1001 16:55:52.722562 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:55:53 crc 
kubenswrapper[4764]: I1001 16:55:53.636812 4764 generic.go:334] "Generic (PLEG): container finished" podID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerID="6ee5d01f3e771b82e6710b06dbe360e8dd9d52a199280b316de10e34177675c7" exitCode=0 Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.637498 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d0d66a21-186e-484a-b560-31d2fabb01b1","Type":"ContainerDied","Data":"6ee5d01f3e771b82e6710b06dbe360e8dd9d52a199280b316de10e34177675c7"} Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.637527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"d0d66a21-186e-484a-b560-31d2fabb01b1","Type":"ContainerDied","Data":"ce95d46ef5b2473173868748f6eb9dfad57f92fe8c9a26baf780b59ac7c508cd"} Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.637538 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce95d46ef5b2473173868748f6eb9dfad57f92fe8c9a26baf780b59ac7c508cd" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.642309 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ca31e39f-7fcb-4389-882d-cfa2d4491df4","Type":"ContainerStarted","Data":"4af7dc05a3ee2f1eb99d05a78bd8181c5d7846dd5b7250ca2ae75bf74069c2ea"} Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.642336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ca31e39f-7fcb-4389-882d-cfa2d4491df4","Type":"ContainerStarted","Data":"b3ded59090aa39ec061bc9c080142d54383695a500a4896d33d88e0b595dfb10"} Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.718887 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.829222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data-custom\") pod \"d0d66a21-186e-484a-b560-31d2fabb01b1\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.829582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-combined-ca-bundle\") pod \"d0d66a21-186e-484a-b560-31d2fabb01b1\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.829676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-scripts\") pod \"d0d66a21-186e-484a-b560-31d2fabb01b1\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.829697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data\") pod \"d0d66a21-186e-484a-b560-31d2fabb01b1\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.829716 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0d66a21-186e-484a-b560-31d2fabb01b1-etc-machine-id\") pod \"d0d66a21-186e-484a-b560-31d2fabb01b1\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.829772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxcc6\" (UniqueName: 
\"kubernetes.io/projected/d0d66a21-186e-484a-b560-31d2fabb01b1-kube-api-access-dxcc6\") pod \"d0d66a21-186e-484a-b560-31d2fabb01b1\" (UID: \"d0d66a21-186e-484a-b560-31d2fabb01b1\") " Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.830246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0d66a21-186e-484a-b560-31d2fabb01b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d0d66a21-186e-484a-b560-31d2fabb01b1" (UID: "d0d66a21-186e-484a-b560-31d2fabb01b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.830423 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0d66a21-186e-484a-b560-31d2fabb01b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.835457 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d0d66a21-186e-484a-b560-31d2fabb01b1" (UID: "d0d66a21-186e-484a-b560-31d2fabb01b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.842337 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-scripts" (OuterVolumeSpecName: "scripts") pod "d0d66a21-186e-484a-b560-31d2fabb01b1" (UID: "d0d66a21-186e-484a-b560-31d2fabb01b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.851149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d66a21-186e-484a-b560-31d2fabb01b1-kube-api-access-dxcc6" (OuterVolumeSpecName: "kube-api-access-dxcc6") pod "d0d66a21-186e-484a-b560-31d2fabb01b1" (UID: "d0d66a21-186e-484a-b560-31d2fabb01b1"). InnerVolumeSpecName "kube-api-access-dxcc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.914469 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0d66a21-186e-484a-b560-31d2fabb01b1" (UID: "d0d66a21-186e-484a-b560-31d2fabb01b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.932916 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.932949 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.932959 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.932968 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxcc6\" (UniqueName: 
\"kubernetes.io/projected/d0d66a21-186e-484a-b560-31d2fabb01b1-kube-api-access-dxcc6\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:53 crc kubenswrapper[4764]: I1001 16:55:53.975425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data" (OuterVolumeSpecName: "config-data") pod "d0d66a21-186e-484a-b560-31d2fabb01b1" (UID: "d0d66a21-186e-484a-b560-31d2fabb01b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.034198 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d66a21-186e-484a-b560-31d2fabb01b1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.538309 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.656966 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.658101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"ca31e39f-7fcb-4389-882d-cfa2d4491df4","Type":"ContainerStarted","Data":"132531a0d5bd1a86e5097e5328d3258554a81df4e28c3d9fe414f0b7a4506567"} Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.692130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.692113331 podStartE2EDuration="3.692113331s" podCreationTimestamp="2025-10-01 16:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:55:54.686369339 +0000 UTC m=+3217.686016174" watchObservedRunningTime="2025-10-01 16:55:54.692113331 +0000 UTC m=+3217.691760166" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.713030 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.721406 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.750667 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:54 crc kubenswrapper[4764]: E1001 16:55:54.751112 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="probe" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.751130 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="probe" Oct 01 16:55:54 crc kubenswrapper[4764]: E1001 16:55:54.751155 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="manila-scheduler" Oct 01 16:55:54 crc 
kubenswrapper[4764]: I1001 16:55:54.751163 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="manila-scheduler" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.751326 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="probe" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.751342 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" containerName="manila-scheduler" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.752452 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.754405 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.771381 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.848551 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-config-data\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.848652 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxbc\" (UniqueName: \"kubernetes.io/projected/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-kube-api-access-fxxbc\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.848675 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.848703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-scripts\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.849205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.849365 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.950949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.951188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.951293 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-config-data\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.951301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.951611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxbc\" (UniqueName: \"kubernetes.io/projected/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-kube-api-access-fxxbc\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.951659 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.951741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-scripts\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " 
pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.957585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.958832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-config-data\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.958992 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-scripts\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.962379 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:54 crc kubenswrapper[4764]: I1001 16:55:54.972130 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxbc\" (UniqueName: \"kubernetes.io/projected/0f9e419b-d3d5-4813-a276-aac1f18ef4f4-kube-api-access-fxxbc\") pod \"manila-scheduler-0\" (UID: \"0f9e419b-d3d5-4813-a276-aac1f18ef4f4\") " pod="openstack/manila-scheduler-0" Oct 01 16:55:55 crc kubenswrapper[4764]: I1001 16:55:55.081785 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 01 16:55:55 crc kubenswrapper[4764]: I1001 16:55:55.430976 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 01 16:55:55 crc kubenswrapper[4764]: I1001 16:55:55.670349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"0f9e419b-d3d5-4813-a276-aac1f18ef4f4","Type":"ContainerStarted","Data":"4adaf72c5e0cc70c93ce3711967a5ae89b261d946db1e788e9041d195fd5f52f"} Oct 01 16:55:55 crc kubenswrapper[4764]: I1001 16:55:55.737611 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d66a21-186e-484a-b560-31d2fabb01b1" path="/var/lib/kubelet/pods/d0d66a21-186e-484a-b560-31d2fabb01b1/volumes" Oct 01 16:55:56 crc kubenswrapper[4764]: I1001 16:55:56.683335 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"0f9e419b-d3d5-4813-a276-aac1f18ef4f4","Type":"ContainerStarted","Data":"7bb4f2a8ef0bba00fdbe66e03900d807460c1ef21c4679867db709d616818360"} Oct 01 16:55:57 crc kubenswrapper[4764]: I1001 16:55:57.698749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"0f9e419b-d3d5-4813-a276-aac1f18ef4f4","Type":"ContainerStarted","Data":"f71988f7d44ade43ae84968de8b791b8dac1d4a3ed38514ebbe912a12a7cca50"} Oct 01 16:55:57 crc kubenswrapper[4764]: I1001 16:55:57.726388 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.7263666779999998 podStartE2EDuration="3.726366678s" podCreationTimestamp="2025-10-01 16:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:55:57.721783665 +0000 UTC m=+3220.721430600" watchObservedRunningTime="2025-10-01 16:55:57.726366678 +0000 UTC m=+3220.726013513" Oct 01 16:56:02 crc kubenswrapper[4764]: I1001 
16:56:02.020581 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 01 16:56:02 crc kubenswrapper[4764]: I1001 16:56:02.097205 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.255:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.255:8443: connect: connection refused" Oct 01 16:56:05 crc kubenswrapper[4764]: I1001 16:56:05.082808 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 01 16:56:05 crc kubenswrapper[4764]: I1001 16:56:05.721590 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:56:05 crc kubenswrapper[4764]: E1001 16:56:05.721911 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:56:12 crc kubenswrapper[4764]: I1001 16:56:12.097526 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b86858896-mnqsl" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.255:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.255:8443: connect: connection refused" Oct 01 16:56:12 crc kubenswrapper[4764]: I1001 16:56:12.098200 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:56:13 crc kubenswrapper[4764]: I1001 16:56:13.467226 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 01 16:56:15 crc kubenswrapper[4764]: I1001 16:56:15.156256 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 16:56:16 crc kubenswrapper[4764]: I1001 16:56:16.882460 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 01 16:56:17 crc kubenswrapper[4764]: I1001 16:56:17.728724 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:56:17 crc kubenswrapper[4764]: E1001 16:56:17.729188 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:56:17 crc kubenswrapper[4764]: I1001 16:56:17.947564 4764 generic.go:334] "Generic (PLEG): container finished" podID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerID="2d7c85ff30844100ee9e0edf53bb3971f83baa32659144e1d89708acae81b02d" exitCode=137 Oct 01 16:56:17 crc kubenswrapper[4764]: I1001 16:56:17.947809 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b86858896-mnqsl" event={"ID":"05cbc202-6448-46cc-85d0-d4b432506ed5","Type":"ContainerDied","Data":"2d7c85ff30844100ee9e0edf53bb3971f83baa32659144e1d89708acae81b02d"} Oct 01 16:56:17 crc kubenswrapper[4764]: I1001 16:56:17.947939 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b86858896-mnqsl" event={"ID":"05cbc202-6448-46cc-85d0-d4b432506ed5","Type":"ContainerDied","Data":"77f129cc3ba76f3dafae42d60132b36a67146c0781067930d937ce62fc83f092"} Oct 01 
16:56:17 crc kubenswrapper[4764]: I1001 16:56:17.947970 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:56:17 crc kubenswrapper[4764]: I1001 16:56:17.947980 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77f129cc3ba76f3dafae42d60132b36a67146c0781067930d937ce62fc83f092" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.097458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-secret-key\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.097825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-scripts\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.097872 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-combined-ca-bundle\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.097898 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-config-data\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.098010 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/05cbc202-6448-46cc-85d0-d4b432506ed5-logs\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.098119 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-tls-certs\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.098250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngz7d\" (UniqueName: \"kubernetes.io/projected/05cbc202-6448-46cc-85d0-d4b432506ed5-kube-api-access-ngz7d\") pod \"05cbc202-6448-46cc-85d0-d4b432506ed5\" (UID: \"05cbc202-6448-46cc-85d0-d4b432506ed5\") " Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.098440 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cbc202-6448-46cc-85d0-d4b432506ed5-logs" (OuterVolumeSpecName: "logs") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.098774 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05cbc202-6448-46cc-85d0-d4b432506ed5-logs\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.107219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.110274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cbc202-6448-46cc-85d0-d4b432506ed5-kube-api-access-ngz7d" (OuterVolumeSpecName: "kube-api-access-ngz7d") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "kube-api-access-ngz7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.121894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-config-data" (OuterVolumeSpecName: "config-data") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.126033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-scripts" (OuterVolumeSpecName: "scripts") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.132897 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.163980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "05cbc202-6448-46cc-85d0-d4b432506ed5" (UID: "05cbc202-6448-46cc-85d0-d4b432506ed5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.202100 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.202149 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngz7d\" (UniqueName: \"kubernetes.io/projected/05cbc202-6448-46cc-85d0-d4b432506ed5-kube-api-access-ngz7d\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.202162 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.202172 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.202182 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cbc202-6448-46cc-85d0-d4b432506ed5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.202190 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/05cbc202-6448-46cc-85d0-d4b432506ed5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:56:18 crc kubenswrapper[4764]: I1001 16:56:18.960718 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b86858896-mnqsl" Oct 01 16:56:19 crc kubenswrapper[4764]: I1001 16:56:19.003329 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b86858896-mnqsl"] Oct 01 16:56:19 crc kubenswrapper[4764]: I1001 16:56:19.011572 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b86858896-mnqsl"] Oct 01 16:56:19 crc kubenswrapper[4764]: I1001 16:56:19.743160 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" path="/var/lib/kubelet/pods/05cbc202-6448-46cc-85d0-d4b432506ed5/volumes" Oct 01 16:56:32 crc kubenswrapper[4764]: I1001 16:56:32.722300 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:56:32 crc kubenswrapper[4764]: E1001 16:56:32.722939 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:56:44 crc kubenswrapper[4764]: I1001 16:56:44.721738 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:56:44 crc kubenswrapper[4764]: E1001 16:56:44.722726 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:56:55 crc kubenswrapper[4764]: I1001 16:56:55.723625 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:56:55 crc kubenswrapper[4764]: E1001 16:56:55.726330 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:57:08 crc kubenswrapper[4764]: I1001 16:57:08.723544 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:57:08 crc kubenswrapper[4764]: E1001 16:57:08.724693 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.065939 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 16:57:18 crc kubenswrapper[4764]: E1001 16:57:18.066907 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.066924 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" Oct 01 16:57:18 crc kubenswrapper[4764]: E1001 16:57:18.066975 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon-log" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.066984 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon-log" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.067221 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.067246 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cbc202-6448-46cc-85d0-d4b432506ed5" containerName="horizon-log" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.067977 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.070634 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r78ls" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.070657 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.070695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.071995 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.088596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xskh6\" (UniqueName: \"kubernetes.io/projected/0d43e380-092b-4488-9956-0ca607448dd4-kube-api-access-xskh6\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184686 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-config-data\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184801 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.184831 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.185007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.185084 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.286884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.286927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.286973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xskh6\" (UniqueName: \"kubernetes.io/projected/0d43e380-092b-4488-9956-0ca607448dd4-kube-api-access-xskh6\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.286993 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287089 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-config-data\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287175 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config\") pod \"tempest-tests-tempest\" 
(UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287446 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.287709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.288001 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.288243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.288933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-config-data\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " 
pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.298445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.298661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.300563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.307118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xskh6\" (UniqueName: \"kubernetes.io/projected/0d43e380-092b-4488-9956-0ca607448dd4-kube-api-access-xskh6\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.319263 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.397014 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.928872 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 16:57:18 crc kubenswrapper[4764]: I1001 16:57:18.940836 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:57:19 crc kubenswrapper[4764]: I1001 16:57:19.603838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d43e380-092b-4488-9956-0ca607448dd4","Type":"ContainerStarted","Data":"7471745d7d9f82d3a4d8298973668e8be0ffead56847a1050afd8c29b1d76d05"} Oct 01 16:57:20 crc kubenswrapper[4764]: I1001 16:57:20.722221 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:57:20 crc kubenswrapper[4764]: E1001 16:57:20.722510 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:57:31 crc kubenswrapper[4764]: I1001 16:57:31.723125 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:57:31 crc kubenswrapper[4764]: E1001 16:57:31.724088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" 
podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.504787 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4rq4"] Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.509128 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.517398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4rq4"] Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.607425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-utilities\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.607569 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-catalog-content\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.607691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr488\" (UniqueName: \"kubernetes.io/projected/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-kube-api-access-jr488\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.709813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-utilities\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.709937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-catalog-content\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.710072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr488\" (UniqueName: \"kubernetes.io/projected/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-kube-api-access-jr488\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.710352 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-utilities\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.710502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-catalog-content\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.744493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr488\" (UniqueName: 
\"kubernetes.io/projected/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-kube-api-access-jr488\") pod \"redhat-operators-k4rq4\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:43 crc kubenswrapper[4764]: I1001 16:57:43.840166 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:57:45 crc kubenswrapper[4764]: I1001 16:57:45.721856 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:57:45 crc kubenswrapper[4764]: E1001 16:57:45.722437 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:57:50 crc kubenswrapper[4764]: E1001 16:57:50.300246 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 01 16:57:50 crc kubenswrapper[4764]: E1001 16:57:50.300883 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xskh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(0d43e380-092b-4488-9956-0ca607448dd4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 16:57:50 crc kubenswrapper[4764]: E1001 16:57:50.302382 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="0d43e380-092b-4488-9956-0ca607448dd4" Oct 01 16:57:50 crc kubenswrapper[4764]: I1001 16:57:50.377027 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4rq4"] Oct 01 16:57:50 crc kubenswrapper[4764]: I1001 16:57:50.921889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" 
event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerStarted","Data":"b4b0a854d792b3849434b300711798603b1530ba83ffadfcc5bf2a68ca60f945"} Oct 01 16:57:50 crc kubenswrapper[4764]: E1001 16:57:50.923753 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="0d43e380-092b-4488-9956-0ca607448dd4" Oct 01 16:57:52 crc kubenswrapper[4764]: E1001 16:57:52.146240 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f0b0d0_0ac2_464c_8fa6_a1042b058daa.slice/crio-2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f0b0d0_0ac2_464c_8fa6_a1042b058daa.slice/crio-conmon-2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d.scope\": RecentStats: unable to find data in memory cache]" Oct 01 16:57:52 crc kubenswrapper[4764]: I1001 16:57:52.958129 4764 generic.go:334] "Generic (PLEG): container finished" podID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerID="2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d" exitCode=0 Oct 01 16:57:52 crc kubenswrapper[4764]: I1001 16:57:52.958343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerDied","Data":"2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d"} Oct 01 16:57:54 crc kubenswrapper[4764]: I1001 16:57:54.983747 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" 
event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerStarted","Data":"dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f"} Oct 01 16:58:00 crc kubenswrapper[4764]: I1001 16:58:00.721734 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:58:00 crc kubenswrapper[4764]: E1001 16:58:00.722849 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:58:03 crc kubenswrapper[4764]: I1001 16:58:03.084156 4764 generic.go:334] "Generic (PLEG): container finished" podID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerID="dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f" exitCode=0 Oct 01 16:58:03 crc kubenswrapper[4764]: I1001 16:58:03.084229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerDied","Data":"dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f"} Oct 01 16:58:04 crc kubenswrapper[4764]: I1001 16:58:04.097958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerStarted","Data":"54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3"} Oct 01 16:58:05 crc kubenswrapper[4764]: I1001 16:58:05.757641 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4rq4" podStartSLOduration=12.113851674 podStartE2EDuration="22.757610573s" 
podCreationTimestamp="2025-10-01 16:57:43 +0000 UTC" firstStartedPulling="2025-10-01 16:57:52.961982352 +0000 UTC m=+3335.961629187" lastFinishedPulling="2025-10-01 16:58:03.605741251 +0000 UTC m=+3346.605388086" observedRunningTime="2025-10-01 16:58:04.123375744 +0000 UTC m=+3347.123022579" watchObservedRunningTime="2025-10-01 16:58:05.757610573 +0000 UTC m=+3348.757257448" Oct 01 16:58:10 crc kubenswrapper[4764]: I1001 16:58:10.175334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d43e380-092b-4488-9956-0ca607448dd4","Type":"ContainerStarted","Data":"713d7a7a5766f4149e580247991137a4cb8e892d501673921d7505f926ca11c7"} Oct 01 16:58:10 crc kubenswrapper[4764]: I1001 16:58:10.199654 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.713109483 podStartE2EDuration="53.199632019s" podCreationTimestamp="2025-10-01 16:57:17 +0000 UTC" firstStartedPulling="2025-10-01 16:57:18.940642205 +0000 UTC m=+3301.940289040" lastFinishedPulling="2025-10-01 16:58:07.427164731 +0000 UTC m=+3350.426811576" observedRunningTime="2025-10-01 16:58:10.199330632 +0000 UTC m=+3353.198977487" watchObservedRunningTime="2025-10-01 16:58:10.199632019 +0000 UTC m=+3353.199278864" Oct 01 16:58:12 crc kubenswrapper[4764]: I1001 16:58:12.721695 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:58:12 crc kubenswrapper[4764]: E1001 16:58:12.722625 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:58:13 crc 
kubenswrapper[4764]: I1001 16:58:13.844022 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:58:13 crc kubenswrapper[4764]: I1001 16:58:13.846071 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:58:13 crc kubenswrapper[4764]: I1001 16:58:13.922534 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:58:14 crc kubenswrapper[4764]: I1001 16:58:14.271476 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:58:14 crc kubenswrapper[4764]: I1001 16:58:14.701887 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4rq4"] Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.242591 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k4rq4" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="registry-server" containerID="cri-o://54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3" gracePeriod=2 Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.780321 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.789022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-utilities\") pod \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.790091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-utilities" (OuterVolumeSpecName: "utilities") pod "38f0b0d0-0ac2-464c-8fa6-a1042b058daa" (UID: "38f0b0d0-0ac2-464c-8fa6-a1042b058daa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.891655 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-catalog-content\") pod \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.891757 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr488\" (UniqueName: \"kubernetes.io/projected/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-kube-api-access-jr488\") pod \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\" (UID: \"38f0b0d0-0ac2-464c-8fa6-a1042b058daa\") " Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.892454 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.898819 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-kube-api-access-jr488" (OuterVolumeSpecName: "kube-api-access-jr488") pod "38f0b0d0-0ac2-464c-8fa6-a1042b058daa" (UID: "38f0b0d0-0ac2-464c-8fa6-a1042b058daa"). InnerVolumeSpecName "kube-api-access-jr488". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.969480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38f0b0d0-0ac2-464c-8fa6-a1042b058daa" (UID: "38f0b0d0-0ac2-464c-8fa6-a1042b058daa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.994872 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr488\" (UniqueName: \"kubernetes.io/projected/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-kube-api-access-jr488\") on node \"crc\" DevicePath \"\"" Oct 01 16:58:16 crc kubenswrapper[4764]: I1001 16:58:16.994925 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38f0b0d0-0ac2-464c-8fa6-a1042b058daa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.254592 4764 generic.go:334] "Generic (PLEG): container finished" podID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerID="54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3" exitCode=0 Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.254646 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerDied","Data":"54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3"} Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.254702 4764 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4rq4" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.254736 4764 scope.go:117] "RemoveContainer" containerID="54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.254720 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4rq4" event={"ID":"38f0b0d0-0ac2-464c-8fa6-a1042b058daa","Type":"ContainerDied","Data":"b4b0a854d792b3849434b300711798603b1530ba83ffadfcc5bf2a68ca60f945"} Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.298490 4764 scope.go:117] "RemoveContainer" containerID="dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.310867 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4rq4"] Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.325869 4764 scope.go:117] "RemoveContainer" containerID="2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.326708 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k4rq4"] Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.376453 4764 scope.go:117] "RemoveContainer" containerID="54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3" Oct 01 16:58:17 crc kubenswrapper[4764]: E1001 16:58:17.377099 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3\": container with ID starting with 54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3 not found: ID does not exist" containerID="54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.377153 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3"} err="failed to get container status \"54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3\": rpc error: code = NotFound desc = could not find container \"54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3\": container with ID starting with 54fa8d2e33d7260655d16711ae5fedd72b3ed835b03f37c23d3e8a9be845d8b3 not found: ID does not exist" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.377184 4764 scope.go:117] "RemoveContainer" containerID="dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f" Oct 01 16:58:17 crc kubenswrapper[4764]: E1001 16:58:17.377631 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f\": container with ID starting with dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f not found: ID does not exist" containerID="dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.377653 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f"} err="failed to get container status \"dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f\": rpc error: code = NotFound desc = could not find container \"dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f\": container with ID starting with dea81dde557da77c5fc01021f16258e428b735967c85452714f83e76417a674f not found: ID does not exist" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.377665 4764 scope.go:117] "RemoveContainer" containerID="2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d" Oct 01 16:58:17 crc kubenswrapper[4764]: E1001 
16:58:17.377901 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d\": container with ID starting with 2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d not found: ID does not exist" containerID="2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.377930 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d"} err="failed to get container status \"2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d\": rpc error: code = NotFound desc = could not find container \"2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d\": container with ID starting with 2c4b46f252db2cd84995cb5305cd9f95571fe3fc1948268f039f1b6f3d27383d not found: ID does not exist" Oct 01 16:58:17 crc kubenswrapper[4764]: I1001 16:58:17.738725 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" path="/var/lib/kubelet/pods/38f0b0d0-0ac2-464c-8fa6-a1042b058daa/volumes" Oct 01 16:58:25 crc kubenswrapper[4764]: I1001 16:58:25.725424 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:58:25 crc kubenswrapper[4764]: E1001 16:58:25.726648 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:58:38 crc kubenswrapper[4764]: I1001 16:58:38.722536 
4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:58:38 crc kubenswrapper[4764]: E1001 16:58:38.723364 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:58:49 crc kubenswrapper[4764]: I1001 16:58:49.721926 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:58:49 crc kubenswrapper[4764]: E1001 16:58:49.723214 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:59:02 crc kubenswrapper[4764]: I1001 16:59:02.722769 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:59:02 crc kubenswrapper[4764]: E1001 16:59:02.723828 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:59:15 crc kubenswrapper[4764]: I1001 
16:59:15.730134 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:59:15 crc kubenswrapper[4764]: E1001 16:59:15.730878 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 16:59:26 crc kubenswrapper[4764]: I1001 16:59:26.722243 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 16:59:27 crc kubenswrapper[4764]: I1001 16:59:27.028641 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"e70823572738d7495a643f986b0c30a203fe5ac2c2c10b874727a1af42da7b92"} Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.171632 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28"] Oct 01 17:00:00 crc kubenswrapper[4764]: E1001 17:00:00.172569 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="registry-server" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.172583 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="registry-server" Oct 01 17:00:00 crc kubenswrapper[4764]: E1001 17:00:00.172594 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="extract-utilities" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.172600 
4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="extract-utilities" Oct 01 17:00:00 crc kubenswrapper[4764]: E1001 17:00:00.172624 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="extract-content" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.172630 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="extract-content" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.172865 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f0b0d0-0ac2-464c-8fa6-a1042b058daa" containerName="registry-server" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.173814 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.176301 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.177823 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.189203 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28"] Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.285226 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vhlk\" (UniqueName: \"kubernetes.io/projected/891dadf0-af3c-4405-8fcf-28c5e15abc37-kube-api-access-2vhlk\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 
01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.285313 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891dadf0-af3c-4405-8fcf-28c5e15abc37-config-volume\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.285352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/891dadf0-af3c-4405-8fcf-28c5e15abc37-secret-volume\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.386652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhlk\" (UniqueName: \"kubernetes.io/projected/891dadf0-af3c-4405-8fcf-28c5e15abc37-kube-api-access-2vhlk\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.386712 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891dadf0-af3c-4405-8fcf-28c5e15abc37-config-volume\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.386741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/891dadf0-af3c-4405-8fcf-28c5e15abc37-secret-volume\") pod 
\"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.388506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891dadf0-af3c-4405-8fcf-28c5e15abc37-config-volume\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.400985 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/891dadf0-af3c-4405-8fcf-28c5e15abc37-secret-volume\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.407866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhlk\" (UniqueName: \"kubernetes.io/projected/891dadf0-af3c-4405-8fcf-28c5e15abc37-kube-api-access-2vhlk\") pod \"collect-profiles-29322300-v9s28\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:00 crc kubenswrapper[4764]: I1001 17:00:00.502035 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:01 crc kubenswrapper[4764]: I1001 17:00:01.002642 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28"] Oct 01 17:00:01 crc kubenswrapper[4764]: I1001 17:00:01.384092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" event={"ID":"891dadf0-af3c-4405-8fcf-28c5e15abc37","Type":"ContainerStarted","Data":"e1cf2282acaaf9b968bd4313c203afc6a4c9ff7f34705ff10a21565fe155910c"} Oct 01 17:00:01 crc kubenswrapper[4764]: I1001 17:00:01.385204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" event={"ID":"891dadf0-af3c-4405-8fcf-28c5e15abc37","Type":"ContainerStarted","Data":"166cf6f919206948f62e15eadb1e793731104f4122339cb95bbdc2464979a872"} Oct 01 17:00:01 crc kubenswrapper[4764]: I1001 17:00:01.420777 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" podStartSLOduration=1.420761931 podStartE2EDuration="1.420761931s" podCreationTimestamp="2025-10-01 17:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 17:00:01.416999898 +0000 UTC m=+3464.416646733" watchObservedRunningTime="2025-10-01 17:00:01.420761931 +0000 UTC m=+3464.420408766" Oct 01 17:00:02 crc kubenswrapper[4764]: I1001 17:00:02.396690 4764 generic.go:334] "Generic (PLEG): container finished" podID="891dadf0-af3c-4405-8fcf-28c5e15abc37" containerID="e1cf2282acaaf9b968bd4313c203afc6a4c9ff7f34705ff10a21565fe155910c" exitCode=0 Oct 01 17:00:02 crc kubenswrapper[4764]: I1001 17:00:02.396802 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" event={"ID":"891dadf0-af3c-4405-8fcf-28c5e15abc37","Type":"ContainerDied","Data":"e1cf2282acaaf9b968bd4313c203afc6a4c9ff7f34705ff10a21565fe155910c"} Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.809595 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.880585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891dadf0-af3c-4405-8fcf-28c5e15abc37-config-volume\") pod \"891dadf0-af3c-4405-8fcf-28c5e15abc37\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.880650 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vhlk\" (UniqueName: \"kubernetes.io/projected/891dadf0-af3c-4405-8fcf-28c5e15abc37-kube-api-access-2vhlk\") pod \"891dadf0-af3c-4405-8fcf-28c5e15abc37\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.880790 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/891dadf0-af3c-4405-8fcf-28c5e15abc37-secret-volume\") pod \"891dadf0-af3c-4405-8fcf-28c5e15abc37\" (UID: \"891dadf0-af3c-4405-8fcf-28c5e15abc37\") " Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.882840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/891dadf0-af3c-4405-8fcf-28c5e15abc37-config-volume" (OuterVolumeSpecName: "config-volume") pod "891dadf0-af3c-4405-8fcf-28c5e15abc37" (UID: "891dadf0-af3c-4405-8fcf-28c5e15abc37"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.887427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891dadf0-af3c-4405-8fcf-28c5e15abc37-kube-api-access-2vhlk" (OuterVolumeSpecName: "kube-api-access-2vhlk") pod "891dadf0-af3c-4405-8fcf-28c5e15abc37" (UID: "891dadf0-af3c-4405-8fcf-28c5e15abc37"). InnerVolumeSpecName "kube-api-access-2vhlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.891663 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/891dadf0-af3c-4405-8fcf-28c5e15abc37-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "891dadf0-af3c-4405-8fcf-28c5e15abc37" (UID: "891dadf0-af3c-4405-8fcf-28c5e15abc37"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.983065 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/891dadf0-af3c-4405-8fcf-28c5e15abc37-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.983103 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vhlk\" (UniqueName: \"kubernetes.io/projected/891dadf0-af3c-4405-8fcf-28c5e15abc37-kube-api-access-2vhlk\") on node \"crc\" DevicePath \"\"" Oct 01 17:00:03 crc kubenswrapper[4764]: I1001 17:00:03.983120 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/891dadf0-af3c-4405-8fcf-28c5e15abc37-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 17:00:04 crc kubenswrapper[4764]: I1001 17:00:04.419709 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" 
event={"ID":"891dadf0-af3c-4405-8fcf-28c5e15abc37","Type":"ContainerDied","Data":"166cf6f919206948f62e15eadb1e793731104f4122339cb95bbdc2464979a872"} Oct 01 17:00:04 crc kubenswrapper[4764]: I1001 17:00:04.419806 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="166cf6f919206948f62e15eadb1e793731104f4122339cb95bbdc2464979a872" Oct 01 17:00:04 crc kubenswrapper[4764]: I1001 17:00:04.419818 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322300-v9s28" Oct 01 17:00:04 crc kubenswrapper[4764]: I1001 17:00:04.895259 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn"] Oct 01 17:00:04 crc kubenswrapper[4764]: I1001 17:00:04.902916 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322255-w4lzn"] Oct 01 17:00:05 crc kubenswrapper[4764]: I1001 17:00:05.735769 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7b9604-377c-4942-8802-a802ab7f9783" path="/var/lib/kubelet/pods/3f7b9604-377c-4942-8802-a802ab7f9783/volumes" Oct 01 17:00:53 crc kubenswrapper[4764]: I1001 17:00:53.730269 4764 scope.go:117] "RemoveContainer" containerID="97b9dc84f2670f9df086ec1f1029c96c15dd629cdc28af00b32f5e084c9ad69b" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.152611 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322301-kgbcw"] Oct 01 17:01:00 crc kubenswrapper[4764]: E1001 17:01:00.153700 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="891dadf0-af3c-4405-8fcf-28c5e15abc37" containerName="collect-profiles" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.153718 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="891dadf0-af3c-4405-8fcf-28c5e15abc37" containerName="collect-profiles" Oct 01 17:01:00 crc 
kubenswrapper[4764]: I1001 17:01:00.153922 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="891dadf0-af3c-4405-8fcf-28c5e15abc37" containerName="collect-profiles" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.154607 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.173092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322301-kgbcw"] Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.277146 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t44t\" (UniqueName: \"kubernetes.io/projected/c858aedc-be1d-4bd6-8c80-906c5345a7df-kube-api-access-4t44t\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.277202 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-fernet-keys\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.277421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-combined-ca-bundle\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.277794 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-config-data\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.379595 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-config-data\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.379717 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-fernet-keys\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.379737 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t44t\" (UniqueName: \"kubernetes.io/projected/c858aedc-be1d-4bd6-8c80-906c5345a7df-kube-api-access-4t44t\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.379769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-combined-ca-bundle\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.386016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-config-data\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.387248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-fernet-keys\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.395499 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t44t\" (UniqueName: \"kubernetes.io/projected/c858aedc-be1d-4bd6-8c80-906c5345a7df-kube-api-access-4t44t\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.396591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-combined-ca-bundle\") pod \"keystone-cron-29322301-kgbcw\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:00 crc kubenswrapper[4764]: I1001 17:01:00.478824 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:01 crc kubenswrapper[4764]: I1001 17:01:01.010292 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322301-kgbcw"] Oct 01 17:01:01 crc kubenswrapper[4764]: I1001 17:01:01.998517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322301-kgbcw" event={"ID":"c858aedc-be1d-4bd6-8c80-906c5345a7df","Type":"ContainerStarted","Data":"841d2ff518ca323edb3a82a9edb60a23790ee55b7773e1c761370b7bce5bcdea"} Oct 01 17:01:01 crc kubenswrapper[4764]: I1001 17:01:01.998772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322301-kgbcw" event={"ID":"c858aedc-be1d-4bd6-8c80-906c5345a7df","Type":"ContainerStarted","Data":"ac20c9e97fc99d7d39f8f1722c86ac108820ed970d24d5d5e34057a1920ca542"} Oct 01 17:01:02 crc kubenswrapper[4764]: I1001 17:01:02.027267 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322301-kgbcw" podStartSLOduration=2.027242396 podStartE2EDuration="2.027242396s" podCreationTimestamp="2025-10-01 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 17:01:02.016976654 +0000 UTC m=+3525.016623489" watchObservedRunningTime="2025-10-01 17:01:02.027242396 +0000 UTC m=+3525.026889231" Oct 01 17:01:04 crc kubenswrapper[4764]: I1001 17:01:04.021664 4764 generic.go:334] "Generic (PLEG): container finished" podID="c858aedc-be1d-4bd6-8c80-906c5345a7df" containerID="841d2ff518ca323edb3a82a9edb60a23790ee55b7773e1c761370b7bce5bcdea" exitCode=0 Oct 01 17:01:04 crc kubenswrapper[4764]: I1001 17:01:04.022109 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322301-kgbcw" 
event={"ID":"c858aedc-be1d-4bd6-8c80-906c5345a7df","Type":"ContainerDied","Data":"841d2ff518ca323edb3a82a9edb60a23790ee55b7773e1c761370b7bce5bcdea"} Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.540103 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.585387 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-config-data\") pod \"c858aedc-be1d-4bd6-8c80-906c5345a7df\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.585602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-fernet-keys\") pod \"c858aedc-be1d-4bd6-8c80-906c5345a7df\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.585706 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t44t\" (UniqueName: \"kubernetes.io/projected/c858aedc-be1d-4bd6-8c80-906c5345a7df-kube-api-access-4t44t\") pod \"c858aedc-be1d-4bd6-8c80-906c5345a7df\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.585855 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-combined-ca-bundle\") pod \"c858aedc-be1d-4bd6-8c80-906c5345a7df\" (UID: \"c858aedc-be1d-4bd6-8c80-906c5345a7df\") " Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.593312 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "c858aedc-be1d-4bd6-8c80-906c5345a7df" (UID: "c858aedc-be1d-4bd6-8c80-906c5345a7df"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.594334 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c858aedc-be1d-4bd6-8c80-906c5345a7df-kube-api-access-4t44t" (OuterVolumeSpecName: "kube-api-access-4t44t") pod "c858aedc-be1d-4bd6-8c80-906c5345a7df" (UID: "c858aedc-be1d-4bd6-8c80-906c5345a7df"). InnerVolumeSpecName "kube-api-access-4t44t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.626370 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c858aedc-be1d-4bd6-8c80-906c5345a7df" (UID: "c858aedc-be1d-4bd6-8c80-906c5345a7df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.654078 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-config-data" (OuterVolumeSpecName: "config-data") pod "c858aedc-be1d-4bd6-8c80-906c5345a7df" (UID: "c858aedc-be1d-4bd6-8c80-906c5345a7df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.688383 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t44t\" (UniqueName: \"kubernetes.io/projected/c858aedc-be1d-4bd6-8c80-906c5345a7df-kube-api-access-4t44t\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.688425 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.688438 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:05 crc kubenswrapper[4764]: I1001 17:01:05.688450 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c858aedc-be1d-4bd6-8c80-906c5345a7df-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:06 crc kubenswrapper[4764]: I1001 17:01:06.042426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322301-kgbcw" event={"ID":"c858aedc-be1d-4bd6-8c80-906c5345a7df","Type":"ContainerDied","Data":"ac20c9e97fc99d7d39f8f1722c86ac108820ed970d24d5d5e34057a1920ca542"} Oct 01 17:01:06 crc kubenswrapper[4764]: I1001 17:01:06.042471 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac20c9e97fc99d7d39f8f1722c86ac108820ed970d24d5d5e34057a1920ca542" Oct 01 17:01:06 crc kubenswrapper[4764]: I1001 17:01:06.042538 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322301-kgbcw" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.277972 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jtvhp"] Oct 01 17:01:20 crc kubenswrapper[4764]: E1001 17:01:20.279154 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c858aedc-be1d-4bd6-8c80-906c5345a7df" containerName="keystone-cron" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.279177 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c858aedc-be1d-4bd6-8c80-906c5345a7df" containerName="keystone-cron" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.279398 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c858aedc-be1d-4bd6-8c80-906c5345a7df" containerName="keystone-cron" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.281092 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.288694 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtvhp"] Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.378735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-catalog-content\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.378795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n969\" (UniqueName: \"kubernetes.io/projected/48c1f72b-eff5-433e-be34-554e2327106c-kube-api-access-9n969\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " 
pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.378985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-utilities\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.481183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-catalog-content\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.481254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n969\" (UniqueName: \"kubernetes.io/projected/48c1f72b-eff5-433e-be34-554e2327106c-kube-api-access-9n969\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.481332 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-utilities\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.481741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-catalog-content\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " 
pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.481803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-utilities\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.501185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n969\" (UniqueName: \"kubernetes.io/projected/48c1f72b-eff5-433e-be34-554e2327106c-kube-api-access-9n969\") pod \"community-operators-jtvhp\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:20 crc kubenswrapper[4764]: I1001 17:01:20.600168 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:21 crc kubenswrapper[4764]: I1001 17:01:21.125475 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jtvhp"] Oct 01 17:01:21 crc kubenswrapper[4764]: W1001 17:01:21.138152 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48c1f72b_eff5_433e_be34_554e2327106c.slice/crio-69b14115a740556e3c6fbb1cdafef06e8fde783985a89cf5e7eb8b80cbc85a7b WatchSource:0}: Error finding container 69b14115a740556e3c6fbb1cdafef06e8fde783985a89cf5e7eb8b80cbc85a7b: Status 404 returned error can't find the container with id 69b14115a740556e3c6fbb1cdafef06e8fde783985a89cf5e7eb8b80cbc85a7b Oct 01 17:01:21 crc kubenswrapper[4764]: I1001 17:01:21.189019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" 
event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerStarted","Data":"69b14115a740556e3c6fbb1cdafef06e8fde783985a89cf5e7eb8b80cbc85a7b"} Oct 01 17:01:22 crc kubenswrapper[4764]: I1001 17:01:22.199058 4764 generic.go:334] "Generic (PLEG): container finished" podID="48c1f72b-eff5-433e-be34-554e2327106c" containerID="3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e" exitCode=0 Oct 01 17:01:22 crc kubenswrapper[4764]: I1001 17:01:22.199118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerDied","Data":"3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e"} Oct 01 17:01:23 crc kubenswrapper[4764]: I1001 17:01:23.211182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerStarted","Data":"0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005"} Oct 01 17:01:24 crc kubenswrapper[4764]: I1001 17:01:24.223258 4764 generic.go:334] "Generic (PLEG): container finished" podID="48c1f72b-eff5-433e-be34-554e2327106c" containerID="0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005" exitCode=0 Oct 01 17:01:24 crc kubenswrapper[4764]: I1001 17:01:24.223345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerDied","Data":"0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005"} Oct 01 17:01:26 crc kubenswrapper[4764]: I1001 17:01:26.251960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerStarted","Data":"e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb"} Oct 01 17:01:26 crc kubenswrapper[4764]: 
I1001 17:01:26.274588 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jtvhp" podStartSLOduration=3.317617002 podStartE2EDuration="6.274566497s" podCreationTimestamp="2025-10-01 17:01:20 +0000 UTC" firstStartedPulling="2025-10-01 17:01:22.201738093 +0000 UTC m=+3545.201384918" lastFinishedPulling="2025-10-01 17:01:25.158687588 +0000 UTC m=+3548.158334413" observedRunningTime="2025-10-01 17:01:26.26900739 +0000 UTC m=+3549.268654225" watchObservedRunningTime="2025-10-01 17:01:26.274566497 +0000 UTC m=+3549.274213332" Oct 01 17:01:30 crc kubenswrapper[4764]: I1001 17:01:30.600651 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:30 crc kubenswrapper[4764]: I1001 17:01:30.601389 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:30 crc kubenswrapper[4764]: I1001 17:01:30.677159 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:31 crc kubenswrapper[4764]: I1001 17:01:31.371746 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:31 crc kubenswrapper[4764]: I1001 17:01:31.422773 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtvhp"] Oct 01 17:01:33 crc kubenswrapper[4764]: I1001 17:01:33.318320 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jtvhp" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="registry-server" containerID="cri-o://e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb" gracePeriod=2 Oct 01 17:01:33 crc kubenswrapper[4764]: I1001 17:01:33.924137 4764 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.080660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-utilities\") pod \"48c1f72b-eff5-433e-be34-554e2327106c\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.080805 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-catalog-content\") pod \"48c1f72b-eff5-433e-be34-554e2327106c\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.080867 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n969\" (UniqueName: \"kubernetes.io/projected/48c1f72b-eff5-433e-be34-554e2327106c-kube-api-access-9n969\") pod \"48c1f72b-eff5-433e-be34-554e2327106c\" (UID: \"48c1f72b-eff5-433e-be34-554e2327106c\") " Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.082153 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-utilities" (OuterVolumeSpecName: "utilities") pod "48c1f72b-eff5-433e-be34-554e2327106c" (UID: "48c1f72b-eff5-433e-be34-554e2327106c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.089230 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c1f72b-eff5-433e-be34-554e2327106c-kube-api-access-9n969" (OuterVolumeSpecName: "kube-api-access-9n969") pod "48c1f72b-eff5-433e-be34-554e2327106c" (UID: "48c1f72b-eff5-433e-be34-554e2327106c"). InnerVolumeSpecName "kube-api-access-9n969". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.183251 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n969\" (UniqueName: \"kubernetes.io/projected/48c1f72b-eff5-433e-be34-554e2327106c-kube-api-access-9n969\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.183284 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.329546 4764 generic.go:334] "Generic (PLEG): container finished" podID="48c1f72b-eff5-433e-be34-554e2327106c" containerID="e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb" exitCode=0 Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.329730 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerDied","Data":"e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb"} Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.329920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jtvhp" event={"ID":"48c1f72b-eff5-433e-be34-554e2327106c","Type":"ContainerDied","Data":"69b14115a740556e3c6fbb1cdafef06e8fde783985a89cf5e7eb8b80cbc85a7b"} Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.329950 4764 scope.go:117] "RemoveContainer" containerID="e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.329793 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jtvhp" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.365722 4764 scope.go:117] "RemoveContainer" containerID="0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.388418 4764 scope.go:117] "RemoveContainer" containerID="3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.442442 4764 scope.go:117] "RemoveContainer" containerID="e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb" Oct 01 17:01:34 crc kubenswrapper[4764]: E1001 17:01:34.442956 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb\": container with ID starting with e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb not found: ID does not exist" containerID="e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.442997 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb"} err="failed to get container status \"e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb\": rpc error: code = NotFound desc = could not find container \"e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb\": container with ID starting with e286c71bb85aace80b39255cba6ab736097e3d8bdba4571365462818df207dbb not found: ID does not exist" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.443023 4764 scope.go:117] "RemoveContainer" containerID="0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005" Oct 01 17:01:34 crc kubenswrapper[4764]: E1001 17:01:34.443308 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005\": container with ID starting with 0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005 not found: ID does not exist" containerID="0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.443341 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005"} err="failed to get container status \"0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005\": rpc error: code = NotFound desc = could not find container \"0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005\": container with ID starting with 0144b324b606a0ee36b9d2f4139bb0faf15ab195c20abe107fc58a7e70e62005 not found: ID does not exist" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.443363 4764 scope.go:117] "RemoveContainer" containerID="3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e" Oct 01 17:01:34 crc kubenswrapper[4764]: E1001 17:01:34.443607 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e\": container with ID starting with 3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e not found: ID does not exist" containerID="3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.443632 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e"} err="failed to get container status \"3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e\": rpc error: code = NotFound desc = could not find container 
\"3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e\": container with ID starting with 3bb9ec21c6f1c0b029f4ecbaaaaa3cd7a2a99b6ff1dcb8f0a53735adf1c84e7e not found: ID does not exist" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.502399 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48c1f72b-eff5-433e-be34-554e2327106c" (UID: "48c1f72b-eff5-433e-be34-554e2327106c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.590634 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48c1f72b-eff5-433e-be34-554e2327106c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.678162 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jtvhp"] Oct 01 17:01:34 crc kubenswrapper[4764]: I1001 17:01:34.689590 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jtvhp"] Oct 01 17:01:35 crc kubenswrapper[4764]: I1001 17:01:35.741493 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c1f72b-eff5-433e-be34-554e2327106c" path="/var/lib/kubelet/pods/48c1f72b-eff5-433e-be34-554e2327106c/volumes" Oct 01 17:01:51 crc kubenswrapper[4764]: I1001 17:01:51.913792 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:01:51 crc kubenswrapper[4764]: I1001 17:01:51.914391 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:01:53 crc kubenswrapper[4764]: I1001 17:01:53.807564 4764 scope.go:117] "RemoveContainer" containerID="4e9e3b2c976d63754df1aa1411440231cac6f40aea8592e89bf8193095bcfd0a" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.095621 4764 scope.go:117] "RemoveContainer" containerID="6ee5d01f3e771b82e6710b06dbe360e8dd9d52a199280b316de10e34177675c7" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.138519 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lqhm5"] Oct 01 17:01:54 crc kubenswrapper[4764]: E1001 17:01:54.138893 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="extract-utilities" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.138906 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="extract-utilities" Oct 01 17:01:54 crc kubenswrapper[4764]: E1001 17:01:54.138926 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="extract-content" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.138933 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="extract-content" Oct 01 17:01:54 crc kubenswrapper[4764]: E1001 17:01:54.138942 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="registry-server" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.138949 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="registry-server" Oct 
01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.139163 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c1f72b-eff5-433e-be34-554e2327106c" containerName="registry-server" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.140408 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.156229 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqhm5"] Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.185488 4764 scope.go:117] "RemoveContainer" containerID="00b0ce2e9c7a0ee4bd98d8086d80e972394da2284f9823baa3132bcee98bce80" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.211681 4764 scope.go:117] "RemoveContainer" containerID="2d7c85ff30844100ee9e0edf53bb3971f83baa32659144e1d89708acae81b02d" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.233932 4764 scope.go:117] "RemoveContainer" containerID="e0228c4449e380c4c919ba5daf54fc18684d6af86252d2ffe795a01c0d656bc2" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.287410 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvft\" (UniqueName: \"kubernetes.io/projected/158fc9d7-20c7-422d-9d8b-85cc60cdb563-kube-api-access-4dvft\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.287510 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-catalog-content\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 
17:01:54.287553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-utilities\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.326986 4764 scope.go:117] "RemoveContainer" containerID="268a56a3776ad5abf45ff01451bfaf631cdffa6b439721b5a272f55e3321d8d9" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.388948 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvft\" (UniqueName: \"kubernetes.io/projected/158fc9d7-20c7-422d-9d8b-85cc60cdb563-kube-api-access-4dvft\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.389455 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-catalog-content\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.389862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-utilities\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.477471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-catalog-content\") pod 
\"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.477544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-utilities\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.478185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvft\" (UniqueName: \"kubernetes.io/projected/158fc9d7-20c7-422d-9d8b-85cc60cdb563-kube-api-access-4dvft\") pod \"redhat-marketplace-lqhm5\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:54 crc kubenswrapper[4764]: I1001 17:01:54.762423 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:01:55 crc kubenswrapper[4764]: I1001 17:01:55.280375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqhm5"] Oct 01 17:01:55 crc kubenswrapper[4764]: I1001 17:01:55.547997 4764 generic.go:334] "Generic (PLEG): container finished" podID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerID="eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46" exitCode=0 Oct 01 17:01:55 crc kubenswrapper[4764]: I1001 17:01:55.548077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqhm5" event={"ID":"158fc9d7-20c7-422d-9d8b-85cc60cdb563","Type":"ContainerDied","Data":"eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46"} Oct 01 17:01:55 crc kubenswrapper[4764]: I1001 17:01:55.548328 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqhm5" event={"ID":"158fc9d7-20c7-422d-9d8b-85cc60cdb563","Type":"ContainerStarted","Data":"2c3173899e0b00eab0980a1f3831e60b299c067479c6b1dd2d06d105be04e810"} Oct 01 17:01:57 crc kubenswrapper[4764]: I1001 17:01:57.571796 4764 generic.go:334] "Generic (PLEG): container finished" podID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerID="09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446" exitCode=0 Oct 01 17:01:57 crc kubenswrapper[4764]: I1001 17:01:57.571845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqhm5" event={"ID":"158fc9d7-20c7-422d-9d8b-85cc60cdb563","Type":"ContainerDied","Data":"09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446"} Oct 01 17:01:58 crc kubenswrapper[4764]: I1001 17:01:58.583319 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqhm5" 
event={"ID":"158fc9d7-20c7-422d-9d8b-85cc60cdb563","Type":"ContainerStarted","Data":"842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46"} Oct 01 17:01:58 crc kubenswrapper[4764]: I1001 17:01:58.604732 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lqhm5" podStartSLOduration=1.9057161759999999 podStartE2EDuration="4.604709596s" podCreationTimestamp="2025-10-01 17:01:54 +0000 UTC" firstStartedPulling="2025-10-01 17:01:55.54961824 +0000 UTC m=+3578.549265075" lastFinishedPulling="2025-10-01 17:01:58.24861166 +0000 UTC m=+3581.248258495" observedRunningTime="2025-10-01 17:01:58.600984075 +0000 UTC m=+3581.600630910" watchObservedRunningTime="2025-10-01 17:01:58.604709596 +0000 UTC m=+3581.604356431" Oct 01 17:02:04 crc kubenswrapper[4764]: I1001 17:02:04.763612 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:02:04 crc kubenswrapper[4764]: I1001 17:02:04.764353 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:02:04 crc kubenswrapper[4764]: I1001 17:02:04.843891 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:02:05 crc kubenswrapper[4764]: I1001 17:02:05.696312 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:02:05 crc kubenswrapper[4764]: I1001 17:02:05.754645 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqhm5"] Oct 01 17:02:07 crc kubenswrapper[4764]: I1001 17:02:07.666610 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lqhm5" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="registry-server" 
containerID="cri-o://842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46" gracePeriod=2 Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.265684 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.375668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dvft\" (UniqueName: \"kubernetes.io/projected/158fc9d7-20c7-422d-9d8b-85cc60cdb563-kube-api-access-4dvft\") pod \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.375758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-catalog-content\") pod \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.375992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-utilities\") pod \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\" (UID: \"158fc9d7-20c7-422d-9d8b-85cc60cdb563\") " Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.377120 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-utilities" (OuterVolumeSpecName: "utilities") pod "158fc9d7-20c7-422d-9d8b-85cc60cdb563" (UID: "158fc9d7-20c7-422d-9d8b-85cc60cdb563"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.382243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158fc9d7-20c7-422d-9d8b-85cc60cdb563-kube-api-access-4dvft" (OuterVolumeSpecName: "kube-api-access-4dvft") pod "158fc9d7-20c7-422d-9d8b-85cc60cdb563" (UID: "158fc9d7-20c7-422d-9d8b-85cc60cdb563"). InnerVolumeSpecName "kube-api-access-4dvft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.388763 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "158fc9d7-20c7-422d-9d8b-85cc60cdb563" (UID: "158fc9d7-20c7-422d-9d8b-85cc60cdb563"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.478466 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dvft\" (UniqueName: \"kubernetes.io/projected/158fc9d7-20c7-422d-9d8b-85cc60cdb563-kube-api-access-4dvft\") on node \"crc\" DevicePath \"\"" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.478498 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.478510 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158fc9d7-20c7-422d-9d8b-85cc60cdb563-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.678457 4764 generic.go:334] "Generic (PLEG): container finished" podID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" 
containerID="842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46" exitCode=0 Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.678512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqhm5" event={"ID":"158fc9d7-20c7-422d-9d8b-85cc60cdb563","Type":"ContainerDied","Data":"842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46"} Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.678546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lqhm5" event={"ID":"158fc9d7-20c7-422d-9d8b-85cc60cdb563","Type":"ContainerDied","Data":"2c3173899e0b00eab0980a1f3831e60b299c067479c6b1dd2d06d105be04e810"} Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.678567 4764 scope.go:117] "RemoveContainer" containerID="842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.678654 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lqhm5" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.711818 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqhm5"] Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.714203 4764 scope.go:117] "RemoveContainer" containerID="09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.719572 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lqhm5"] Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.734225 4764 scope.go:117] "RemoveContainer" containerID="eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.785319 4764 scope.go:117] "RemoveContainer" containerID="842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46" Oct 01 17:02:08 crc kubenswrapper[4764]: E1001 17:02:08.787245 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46\": container with ID starting with 842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46 not found: ID does not exist" containerID="842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.787287 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46"} err="failed to get container status \"842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46\": rpc error: code = NotFound desc = could not find container \"842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46\": container with ID starting with 842cdd9e06d4c3ccaa456b5ec1e1da695e731ce9037a52a13e4ee65b4086ab46 not found: 
ID does not exist" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.787316 4764 scope.go:117] "RemoveContainer" containerID="09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446" Oct 01 17:02:08 crc kubenswrapper[4764]: E1001 17:02:08.787749 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446\": container with ID starting with 09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446 not found: ID does not exist" containerID="09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.787862 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446"} err="failed to get container status \"09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446\": rpc error: code = NotFound desc = could not find container \"09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446\": container with ID starting with 09855e75790422628bc87d4caf2d97e3b26e27a620a10621923cd435c77a9446 not found: ID does not exist" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.787952 4764 scope.go:117] "RemoveContainer" containerID="eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46" Oct 01 17:02:08 crc kubenswrapper[4764]: E1001 17:02:08.788273 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46\": container with ID starting with eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46 not found: ID does not exist" containerID="eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46" Oct 01 17:02:08 crc kubenswrapper[4764]: I1001 17:02:08.788386 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46"} err="failed to get container status \"eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46\": rpc error: code = NotFound desc = could not find container \"eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46\": container with ID starting with eb8d3bef90554d67904f20b959dddc787732f99b748addb2434d8bb01d8d1f46 not found: ID does not exist" Oct 01 17:02:08 crc kubenswrapper[4764]: E1001 17:02:08.872680 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158fc9d7_20c7_422d_9d8b_85cc60cdb563.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:02:09 crc kubenswrapper[4764]: I1001 17:02:09.738760 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" path="/var/lib/kubelet/pods/158fc9d7-20c7-422d-9d8b-85cc60cdb563/volumes" Oct 01 17:02:21 crc kubenswrapper[4764]: I1001 17:02:21.914076 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:02:21 crc kubenswrapper[4764]: I1001 17:02:21.914687 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.489387 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qrxvt"] 
Oct 01 17:02:24 crc kubenswrapper[4764]: E1001 17:02:24.491199 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="extract-content" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.491292 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="extract-content" Oct 01 17:02:24 crc kubenswrapper[4764]: E1001 17:02:24.491377 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="extract-utilities" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.491444 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="extract-utilities" Oct 01 17:02:24 crc kubenswrapper[4764]: E1001 17:02:24.491522 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="registry-server" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.491586 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="registry-server" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.491858 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="158fc9d7-20c7-422d-9d8b-85cc60cdb563" containerName="registry-server" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.493345 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.505263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrxvt"] Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.609229 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-utilities\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.609524 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-catalog-content\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.609678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct8cb\" (UniqueName: \"kubernetes.io/projected/f287a53c-85ee-4dba-aecb-53d477c2db09-kube-api-access-ct8cb\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.711956 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-utilities\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.712014 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-catalog-content\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.712136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct8cb\" (UniqueName: \"kubernetes.io/projected/f287a53c-85ee-4dba-aecb-53d477c2db09-kube-api-access-ct8cb\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.712576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-utilities\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.712730 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-catalog-content\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.732826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct8cb\" (UniqueName: \"kubernetes.io/projected/f287a53c-85ee-4dba-aecb-53d477c2db09-kube-api-access-ct8cb\") pod \"certified-operators-qrxvt\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:24 crc kubenswrapper[4764]: I1001 17:02:24.821758 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:25 crc kubenswrapper[4764]: W1001 17:02:25.827425 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf287a53c_85ee_4dba_aecb_53d477c2db09.slice/crio-c1e3822f905fb6ac8f6a4c99a4d31bc8ebdf488d4d2b139d1807b9caabd33064 WatchSource:0}: Error finding container c1e3822f905fb6ac8f6a4c99a4d31bc8ebdf488d4d2b139d1807b9caabd33064: Status 404 returned error can't find the container with id c1e3822f905fb6ac8f6a4c99a4d31bc8ebdf488d4d2b139d1807b9caabd33064 Oct 01 17:02:25 crc kubenswrapper[4764]: I1001 17:02:25.829375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qrxvt"] Oct 01 17:02:25 crc kubenswrapper[4764]: I1001 17:02:25.849763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerStarted","Data":"c1e3822f905fb6ac8f6a4c99a4d31bc8ebdf488d4d2b139d1807b9caabd33064"} Oct 01 17:02:26 crc kubenswrapper[4764]: I1001 17:02:26.866236 4764 generic.go:334] "Generic (PLEG): container finished" podID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerID="b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e" exitCode=0 Oct 01 17:02:26 crc kubenswrapper[4764]: I1001 17:02:26.866310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerDied","Data":"b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e"} Oct 01 17:02:26 crc kubenswrapper[4764]: I1001 17:02:26.869903 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 17:02:28 crc kubenswrapper[4764]: I1001 17:02:28.896188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerStarted","Data":"2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907"} Oct 01 17:02:30 crc kubenswrapper[4764]: I1001 17:02:30.915349 4764 generic.go:334] "Generic (PLEG): container finished" podID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerID="2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907" exitCode=0 Oct 01 17:02:30 crc kubenswrapper[4764]: I1001 17:02:30.915428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerDied","Data":"2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907"} Oct 01 17:02:31 crc kubenswrapper[4764]: I1001 17:02:31.927170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerStarted","Data":"90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6"} Oct 01 17:02:31 crc kubenswrapper[4764]: I1001 17:02:31.955263 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qrxvt" podStartSLOduration=3.169585249 podStartE2EDuration="7.955240031s" podCreationTimestamp="2025-10-01 17:02:24 +0000 UTC" firstStartedPulling="2025-10-01 17:02:26.869559491 +0000 UTC m=+3609.869206326" lastFinishedPulling="2025-10-01 17:02:31.655214273 +0000 UTC m=+3614.654861108" observedRunningTime="2025-10-01 17:02:31.945455961 +0000 UTC m=+3614.945102826" watchObservedRunningTime="2025-10-01 17:02:31.955240031 +0000 UTC m=+3614.954886866" Oct 01 17:02:34 crc kubenswrapper[4764]: I1001 17:02:34.822435 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:34 crc kubenswrapper[4764]: I1001 17:02:34.822915 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:34 crc kubenswrapper[4764]: I1001 17:02:34.898327 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:44 crc kubenswrapper[4764]: I1001 17:02:44.897997 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:44 crc kubenswrapper[4764]: I1001 17:02:44.950075 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrxvt"] Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.055239 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qrxvt" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="registry-server" containerID="cri-o://90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6" gracePeriod=2 Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.529222 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.718180 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-catalog-content\") pod \"f287a53c-85ee-4dba-aecb-53d477c2db09\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.718258 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-utilities\") pod \"f287a53c-85ee-4dba-aecb-53d477c2db09\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.718375 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct8cb\" (UniqueName: \"kubernetes.io/projected/f287a53c-85ee-4dba-aecb-53d477c2db09-kube-api-access-ct8cb\") pod \"f287a53c-85ee-4dba-aecb-53d477c2db09\" (UID: \"f287a53c-85ee-4dba-aecb-53d477c2db09\") " Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.720560 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-utilities" (OuterVolumeSpecName: "utilities") pod "f287a53c-85ee-4dba-aecb-53d477c2db09" (UID: "f287a53c-85ee-4dba-aecb-53d477c2db09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.726428 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f287a53c-85ee-4dba-aecb-53d477c2db09-kube-api-access-ct8cb" (OuterVolumeSpecName: "kube-api-access-ct8cb") pod "f287a53c-85ee-4dba-aecb-53d477c2db09" (UID: "f287a53c-85ee-4dba-aecb-53d477c2db09"). InnerVolumeSpecName "kube-api-access-ct8cb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.762015 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f287a53c-85ee-4dba-aecb-53d477c2db09" (UID: "f287a53c-85ee-4dba-aecb-53d477c2db09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.820686 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.820718 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f287a53c-85ee-4dba-aecb-53d477c2db09-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:02:45 crc kubenswrapper[4764]: I1001 17:02:45.820728 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct8cb\" (UniqueName: \"kubernetes.io/projected/f287a53c-85ee-4dba-aecb-53d477c2db09-kube-api-access-ct8cb\") on node \"crc\" DevicePath \"\"" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.066288 4764 generic.go:334] "Generic (PLEG): container finished" podID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerID="90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6" exitCode=0 Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.066346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerDied","Data":"90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6"} Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.066378 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-qrxvt" event={"ID":"f287a53c-85ee-4dba-aecb-53d477c2db09","Type":"ContainerDied","Data":"c1e3822f905fb6ac8f6a4c99a4d31bc8ebdf488d4d2b139d1807b9caabd33064"} Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.066395 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qrxvt" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.066412 4764 scope.go:117] "RemoveContainer" containerID="90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.101023 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qrxvt"] Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.110113 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qrxvt"] Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.111649 4764 scope.go:117] "RemoveContainer" containerID="2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.139299 4764 scope.go:117] "RemoveContainer" containerID="b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.180182 4764 scope.go:117] "RemoveContainer" containerID="90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6" Oct 01 17:02:46 crc kubenswrapper[4764]: E1001 17:02:46.180847 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6\": container with ID starting with 90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6 not found: ID does not exist" containerID="90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 
17:02:46.180950 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6"} err="failed to get container status \"90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6\": rpc error: code = NotFound desc = could not find container \"90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6\": container with ID starting with 90aed7d5f2c2295a89b2288aa9952778e6d37a83635d0d6ced63a9797ce3a8d6 not found: ID does not exist" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.181034 4764 scope.go:117] "RemoveContainer" containerID="2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907" Oct 01 17:02:46 crc kubenswrapper[4764]: E1001 17:02:46.181514 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907\": container with ID starting with 2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907 not found: ID does not exist" containerID="2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.181563 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907"} err="failed to get container status \"2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907\": rpc error: code = NotFound desc = could not find container \"2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907\": container with ID starting with 2d76f56a857cb051ffdb3fe6d2b7b5018e5a169764a1a6a8df225c10486d5907 not found: ID does not exist" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.181595 4764 scope.go:117] "RemoveContainer" containerID="b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e" Oct 01 17:02:46 crc 
kubenswrapper[4764]: E1001 17:02:46.182019 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e\": container with ID starting with b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e not found: ID does not exist" containerID="b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e" Oct 01 17:02:46 crc kubenswrapper[4764]: I1001 17:02:46.182264 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e"} err="failed to get container status \"b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e\": rpc error: code = NotFound desc = could not find container \"b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e\": container with ID starting with b420d2c82b93cf7c25515db896ca8118ef692cd56ec33cd5ec60c22130d2f74e not found: ID does not exist" Oct 01 17:02:47 crc kubenswrapper[4764]: I1001 17:02:47.736823 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" path="/var/lib/kubelet/pods/f287a53c-85ee-4dba-aecb-53d477c2db09/volumes" Oct 01 17:02:51 crc kubenswrapper[4764]: I1001 17:02:51.914106 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:02:51 crc kubenswrapper[4764]: I1001 17:02:51.914726 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 01 17:02:51 crc kubenswrapper[4764]: I1001 17:02:51.914790 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 17:02:51 crc kubenswrapper[4764]: I1001 17:02:51.915501 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e70823572738d7495a643f986b0c30a203fe5ac2c2c10b874727a1af42da7b92"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 17:02:51 crc kubenswrapper[4764]: I1001 17:02:51.915546 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://e70823572738d7495a643f986b0c30a203fe5ac2c2c10b874727a1af42da7b92" gracePeriod=600 Oct 01 17:02:52 crc kubenswrapper[4764]: I1001 17:02:52.122776 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="e70823572738d7495a643f986b0c30a203fe5ac2c2c10b874727a1af42da7b92" exitCode=0 Oct 01 17:02:52 crc kubenswrapper[4764]: I1001 17:02:52.122845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"e70823572738d7495a643f986b0c30a203fe5ac2c2c10b874727a1af42da7b92"} Oct 01 17:02:52 crc kubenswrapper[4764]: I1001 17:02:52.123095 4764 scope.go:117] "RemoveContainer" containerID="19f2e7284c0cb3c14795de5b01f41a19b60f130132cf6e8b7bf5dc84247c1d7a" Oct 01 17:02:53 crc kubenswrapper[4764]: I1001 17:02:53.134177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99"} Oct 01 17:04:32 crc kubenswrapper[4764]: I1001 17:04:32.038338 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-xzsgk"] Oct 01 17:04:32 crc kubenswrapper[4764]: I1001 17:04:32.051783 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-xzsgk"] Oct 01 17:04:33 crc kubenswrapper[4764]: I1001 17:04:33.744076 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372fcf2a-4cac-4506-b875-aada21327f29" path="/var/lib/kubelet/pods/372fcf2a-4cac-4506-b875-aada21327f29/volumes" Oct 01 17:04:55 crc kubenswrapper[4764]: I1001 17:04:55.093309 4764 scope.go:117] "RemoveContainer" containerID="9b679cf0b93bea3fce5f3ae4f8ea91361e8c41366e268cb8007b03a271f88744" Oct 01 17:05:00 crc kubenswrapper[4764]: I1001 17:05:00.041636 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-5680-account-create-rcvwg"] Oct 01 17:05:00 crc kubenswrapper[4764]: I1001 17:05:00.049932 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-5680-account-create-rcvwg"] Oct 01 17:05:01 crc kubenswrapper[4764]: I1001 17:05:01.736827 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b" path="/var/lib/kubelet/pods/d9fc155d-1668-42dd-8db9-9ae2b8fa2a4b/volumes" Oct 01 17:05:21 crc kubenswrapper[4764]: I1001 17:05:21.913938 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:05:21 crc kubenswrapper[4764]: I1001 17:05:21.914535 4764 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:05:26 crc kubenswrapper[4764]: I1001 17:05:26.063650 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-lcwlv"] Oct 01 17:05:26 crc kubenswrapper[4764]: I1001 17:05:26.076027 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-lcwlv"] Oct 01 17:05:27 crc kubenswrapper[4764]: I1001 17:05:27.736867 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e16d19-9b17-40a6-91ff-9572ef612fa3" path="/var/lib/kubelet/pods/d0e16d19-9b17-40a6-91ff-9572ef612fa3/volumes" Oct 01 17:05:51 crc kubenswrapper[4764]: I1001 17:05:51.913812 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:05:51 crc kubenswrapper[4764]: I1001 17:05:51.914467 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:05:55 crc kubenswrapper[4764]: I1001 17:05:55.173335 4764 scope.go:117] "RemoveContainer" containerID="ccdce9c96c62945a607316ee9a35853031bfb486442bc83bbeb25999ca021519" Oct 01 17:05:55 crc kubenswrapper[4764]: I1001 17:05:55.259062 4764 scope.go:117] "RemoveContainer" containerID="bd0d85c512fde3baa853a6fc3609290185a9ae0c8ca0d29b73d35e637c305c03" Oct 01 17:06:21 crc 
kubenswrapper[4764]: I1001 17:06:21.913853 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:06:21 crc kubenswrapper[4764]: I1001 17:06:21.914498 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:06:21 crc kubenswrapper[4764]: I1001 17:06:21.914555 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 17:06:21 crc kubenswrapper[4764]: I1001 17:06:21.915453 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 17:06:21 crc kubenswrapper[4764]: I1001 17:06:21.915526 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" gracePeriod=600 Oct 01 17:06:22 crc kubenswrapper[4764]: E1001 17:06:22.104484 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:06:22 crc kubenswrapper[4764]: I1001 17:06:22.129518 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" exitCode=0 Oct 01 17:06:22 crc kubenswrapper[4764]: I1001 17:06:22.129563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99"} Oct 01 17:06:22 crc kubenswrapper[4764]: I1001 17:06:22.129597 4764 scope.go:117] "RemoveContainer" containerID="e70823572738d7495a643f986b0c30a203fe5ac2c2c10b874727a1af42da7b92" Oct 01 17:06:22 crc kubenswrapper[4764]: I1001 17:06:22.130336 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:06:22 crc kubenswrapper[4764]: E1001 17:06:22.130579 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:06:36 crc kubenswrapper[4764]: I1001 17:06:36.722512 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:06:36 crc kubenswrapper[4764]: E1001 17:06:36.723226 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:06:51 crc kubenswrapper[4764]: I1001 17:06:51.722479 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:06:51 crc kubenswrapper[4764]: E1001 17:06:51.723534 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:07:02 crc kubenswrapper[4764]: I1001 17:07:02.721882 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:07:02 crc kubenswrapper[4764]: E1001 17:07:02.722714 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:07:16 crc kubenswrapper[4764]: I1001 17:07:16.722172 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:07:16 crc kubenswrapper[4764]: E1001 17:07:16.722931 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:07:31 crc kubenswrapper[4764]: I1001 17:07:31.722280 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:07:31 crc kubenswrapper[4764]: E1001 17:07:31.723427 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:07:42 crc kubenswrapper[4764]: I1001 17:07:42.722129 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:07:42 crc kubenswrapper[4764]: E1001 17:07:42.722813 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:07:56 crc kubenswrapper[4764]: I1001 17:07:56.722571 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:07:56 crc kubenswrapper[4764]: E1001 17:07:56.723471 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:08:07 crc kubenswrapper[4764]: I1001 17:08:07.732134 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:08:07 crc kubenswrapper[4764]: E1001 17:08:07.733186 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:08:20 crc kubenswrapper[4764]: I1001 17:08:20.722629 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:08:20 crc kubenswrapper[4764]: E1001 17:08:20.723308 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.493482 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vv8v5"] Oct 01 17:08:28 crc kubenswrapper[4764]: E1001 17:08:28.495925 
4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="extract-utilities" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.495998 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="extract-utilities" Oct 01 17:08:28 crc kubenswrapper[4764]: E1001 17:08:28.496020 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="extract-content" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.496027 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="extract-content" Oct 01 17:08:28 crc kubenswrapper[4764]: E1001 17:08:28.496054 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="registry-server" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.498451 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="registry-server" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.498767 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f287a53c-85ee-4dba-aecb-53d477c2db09" containerName="registry-server" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.504961 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.519299 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vv8v5"] Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.534345 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twzh9\" (UniqueName: \"kubernetes.io/projected/60c666b6-6a27-4e13-aef6-7438e19d9999-kube-api-access-twzh9\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.534462 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-utilities\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.534617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-catalog-content\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.636027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twzh9\" (UniqueName: \"kubernetes.io/projected/60c666b6-6a27-4e13-aef6-7438e19d9999-kube-api-access-twzh9\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.636139 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-utilities\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.636288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-catalog-content\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.636747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-catalog-content\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.636807 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-utilities\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.674362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twzh9\" (UniqueName: \"kubernetes.io/projected/60c666b6-6a27-4e13-aef6-7438e19d9999-kube-api-access-twzh9\") pod \"redhat-operators-vv8v5\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:28 crc kubenswrapper[4764]: I1001 17:08:28.841675 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:29 crc kubenswrapper[4764]: I1001 17:08:29.329706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vv8v5"] Oct 01 17:08:30 crc kubenswrapper[4764]: I1001 17:08:30.322182 4764 generic.go:334] "Generic (PLEG): container finished" podID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerID="b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820" exitCode=0 Oct 01 17:08:30 crc kubenswrapper[4764]: I1001 17:08:30.322239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerDied","Data":"b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820"} Oct 01 17:08:30 crc kubenswrapper[4764]: I1001 17:08:30.322299 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerStarted","Data":"33539bb93f62dcf8b608b0ac293d00dc6c8929f4fa9943d0f7ec696933368657"} Oct 01 17:08:30 crc kubenswrapper[4764]: I1001 17:08:30.324873 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 17:08:32 crc kubenswrapper[4764]: I1001 17:08:32.343647 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerStarted","Data":"3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78"} Oct 01 17:08:35 crc kubenswrapper[4764]: I1001 17:08:35.724235 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:08:35 crc kubenswrapper[4764]: E1001 17:08:35.725511 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:08:37 crc kubenswrapper[4764]: I1001 17:08:37.390265 4764 generic.go:334] "Generic (PLEG): container finished" podID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerID="3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78" exitCode=0 Oct 01 17:08:37 crc kubenswrapper[4764]: I1001 17:08:37.390418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerDied","Data":"3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78"} Oct 01 17:08:38 crc kubenswrapper[4764]: I1001 17:08:38.404414 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerStarted","Data":"9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e"} Oct 01 17:08:48 crc kubenswrapper[4764]: I1001 17:08:48.722066 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:08:48 crc kubenswrapper[4764]: E1001 17:08:48.722942 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:08:48 crc kubenswrapper[4764]: I1001 17:08:48.842095 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:48 crc kubenswrapper[4764]: I1001 17:08:48.842159 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:48 crc kubenswrapper[4764]: I1001 17:08:48.892194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:48 crc kubenswrapper[4764]: I1001 17:08:48.910294 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vv8v5" podStartSLOduration=13.073560841 podStartE2EDuration="20.91027589s" podCreationTimestamp="2025-10-01 17:08:28 +0000 UTC" firstStartedPulling="2025-10-01 17:08:30.324558789 +0000 UTC m=+3973.324205634" lastFinishedPulling="2025-10-01 17:08:38.161273848 +0000 UTC m=+3981.160920683" observedRunningTime="2025-10-01 17:08:39.434461666 +0000 UTC m=+3982.434108511" watchObservedRunningTime="2025-10-01 17:08:48.91027589 +0000 UTC m=+3991.909922725" Oct 01 17:08:49 crc kubenswrapper[4764]: I1001 17:08:49.570810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:49 crc kubenswrapper[4764]: I1001 17:08:49.614835 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vv8v5"] Oct 01 17:08:51 crc kubenswrapper[4764]: I1001 17:08:51.524564 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vv8v5" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="registry-server" containerID="cri-o://9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e" gracePeriod=2 Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.009231 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.110498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-utilities\") pod \"60c666b6-6a27-4e13-aef6-7438e19d9999\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.110587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twzh9\" (UniqueName: \"kubernetes.io/projected/60c666b6-6a27-4e13-aef6-7438e19d9999-kube-api-access-twzh9\") pod \"60c666b6-6a27-4e13-aef6-7438e19d9999\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.110639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-catalog-content\") pod \"60c666b6-6a27-4e13-aef6-7438e19d9999\" (UID: \"60c666b6-6a27-4e13-aef6-7438e19d9999\") " Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.111536 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-utilities" (OuterVolumeSpecName: "utilities") pod "60c666b6-6a27-4e13-aef6-7438e19d9999" (UID: "60c666b6-6a27-4e13-aef6-7438e19d9999"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.116701 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c666b6-6a27-4e13-aef6-7438e19d9999-kube-api-access-twzh9" (OuterVolumeSpecName: "kube-api-access-twzh9") pod "60c666b6-6a27-4e13-aef6-7438e19d9999" (UID: "60c666b6-6a27-4e13-aef6-7438e19d9999"). InnerVolumeSpecName "kube-api-access-twzh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.203910 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60c666b6-6a27-4e13-aef6-7438e19d9999" (UID: "60c666b6-6a27-4e13-aef6-7438e19d9999"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.212861 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.212908 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twzh9\" (UniqueName: \"kubernetes.io/projected/60c666b6-6a27-4e13-aef6-7438e19d9999-kube-api-access-twzh9\") on node \"crc\" DevicePath \"\"" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.212924 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60c666b6-6a27-4e13-aef6-7438e19d9999-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.534181 4764 generic.go:334] "Generic (PLEG): container finished" podID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerID="9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e" exitCode=0 Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.534519 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerDied","Data":"9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e"} Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.534547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vv8v5" event={"ID":"60c666b6-6a27-4e13-aef6-7438e19d9999","Type":"ContainerDied","Data":"33539bb93f62dcf8b608b0ac293d00dc6c8929f4fa9943d0f7ec696933368657"} Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.534565 4764 scope.go:117] "RemoveContainer" containerID="9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.534677 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vv8v5" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.570683 4764 scope.go:117] "RemoveContainer" containerID="3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.607364 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vv8v5"] Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.613167 4764 scope.go:117] "RemoveContainer" containerID="b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.647110 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vv8v5"] Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.657654 4764 scope.go:117] "RemoveContainer" containerID="9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e" Oct 01 17:08:52 crc kubenswrapper[4764]: E1001 17:08:52.658122 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e\": container with ID starting with 9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e not found: ID does not exist" containerID="9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.658175 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e"} err="failed to get container status \"9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e\": rpc error: code = NotFound desc = could not find container \"9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e\": container with ID starting with 9d2e242715fa8ab3ce880de1837617379ab0796d857102c1dc08a363086d816e not found: ID does not exist" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.658204 4764 scope.go:117] "RemoveContainer" containerID="3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78" Oct 01 17:08:52 crc kubenswrapper[4764]: E1001 17:08:52.658668 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78\": container with ID starting with 3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78 not found: ID does not exist" containerID="3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.658769 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78"} err="failed to get container status \"3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78\": rpc error: code = NotFound desc = could not find container \"3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78\": container with ID starting with 3e4755e5d832569056b1476f463a7b45200b8988e7880af027d732d6110e1f78 not found: ID does not exist" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.658831 4764 scope.go:117] "RemoveContainer" containerID="b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820" Oct 01 17:08:52 crc kubenswrapper[4764]: E1001 
17:08:52.659260 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820\": container with ID starting with b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820 not found: ID does not exist" containerID="b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820" Oct 01 17:08:52 crc kubenswrapper[4764]: I1001 17:08:52.659288 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820"} err="failed to get container status \"b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820\": rpc error: code = NotFound desc = could not find container \"b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820\": container with ID starting with b00dbb6a9fc91ada560d97bcf26f1e7cbf4a09e4d1edef7152d6067815513820 not found: ID does not exist" Oct 01 17:08:53 crc kubenswrapper[4764]: I1001 17:08:53.732963 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" path="/var/lib/kubelet/pods/60c666b6-6a27-4e13-aef6-7438e19d9999/volumes" Oct 01 17:08:59 crc kubenswrapper[4764]: I1001 17:08:59.723084 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:08:59 crc kubenswrapper[4764]: E1001 17:08:59.724119 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:09:12 crc kubenswrapper[4764]: I1001 17:09:12.722807 
4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:09:12 crc kubenswrapper[4764]: E1001 17:09:12.724169 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:09:27 crc kubenswrapper[4764]: I1001 17:09:27.728680 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:09:27 crc kubenswrapper[4764]: E1001 17:09:27.730084 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:09:39 crc kubenswrapper[4764]: I1001 17:09:39.722910 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:09:39 crc kubenswrapper[4764]: E1001 17:09:39.725817 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:09:53 crc kubenswrapper[4764]: I1001 
17:09:53.723004 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:09:53 crc kubenswrapper[4764]: E1001 17:09:53.724281 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:10:05 crc kubenswrapper[4764]: I1001 17:10:05.722146 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:10:05 crc kubenswrapper[4764]: E1001 17:10:05.723003 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:10:17 crc kubenswrapper[4764]: I1001 17:10:17.722837 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:10:17 crc kubenswrapper[4764]: E1001 17:10:17.723768 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:10:22 crc 
kubenswrapper[4764]: I1001 17:10:22.383021 4764 generic.go:334] "Generic (PLEG): container finished" podID="0d43e380-092b-4488-9956-0ca607448dd4" containerID="713d7a7a5766f4149e580247991137a4cb8e892d501673921d7505f926ca11c7" exitCode=0 Oct 01 17:10:22 crc kubenswrapper[4764]: I1001 17:10:22.383115 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d43e380-092b-4488-9956-0ca607448dd4","Type":"ContainerDied","Data":"713d7a7a5766f4149e580247991137a4cb8e892d501673921d7505f926ca11c7"} Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.783680 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860107 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xskh6\" (UniqueName: \"kubernetes.io/projected/0d43e380-092b-4488-9956-0ca607448dd4-kube-api-access-xskh6\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860502 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860573 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ca-certs\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860669 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-workdir\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860717 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-temporary\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config-secret\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860804 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ssh-key\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.860856 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-config-data\") pod \"0d43e380-092b-4488-9956-0ca607448dd4\" (UID: \"0d43e380-092b-4488-9956-0ca607448dd4\") 
" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.861635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.861790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-config-data" (OuterVolumeSpecName: "config-data") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.862514 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.862540 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.867804 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.877996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.881033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d43e380-092b-4488-9956-0ca607448dd4-kube-api-access-xskh6" (OuterVolumeSpecName: "kube-api-access-xskh6") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "kube-api-access-xskh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.900549 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.906338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.908780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.929324 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0d43e380-092b-4488-9956-0ca607448dd4" (UID: "0d43e380-092b-4488-9956-0ca607448dd4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964260 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964312 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xskh6\" (UniqueName: \"kubernetes.io/projected/0d43e380-092b-4488-9956-0ca607448dd4-kube-api-access-xskh6\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964357 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964372 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config\") on node 
\"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964382 4764 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964393 4764 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d43e380-092b-4488-9956-0ca607448dd4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.964404 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d43e380-092b-4488-9956-0ca607448dd4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:23 crc kubenswrapper[4764]: I1001 17:10:23.986409 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 01 17:10:24 crc kubenswrapper[4764]: I1001 17:10:24.066248 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 01 17:10:24 crc kubenswrapper[4764]: I1001 17:10:24.402743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d43e380-092b-4488-9956-0ca607448dd4","Type":"ContainerDied","Data":"7471745d7d9f82d3a4d8298973668e8be0ffead56847a1050afd8c29b1d76d05"} Oct 01 17:10:24 crc kubenswrapper[4764]: I1001 17:10:24.402790 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7471745d7d9f82d3a4d8298973668e8be0ffead56847a1050afd8c29b1d76d05" Oct 01 17:10:24 crc kubenswrapper[4764]: I1001 17:10:24.402862 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 17:10:29 crc kubenswrapper[4764]: I1001 17:10:29.722301 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:10:29 crc kubenswrapper[4764]: E1001 17:10:29.722971 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.452716 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 17:10:36 crc kubenswrapper[4764]: E1001 17:10:36.459217 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="extract-utilities" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.459234 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="extract-utilities" Oct 01 17:10:36 crc kubenswrapper[4764]: E1001 17:10:36.459249 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="extract-content" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.459256 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="extract-content" Oct 01 17:10:36 crc kubenswrapper[4764]: E1001 17:10:36.459278 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d43e380-092b-4488-9956-0ca607448dd4" containerName="tempest-tests-tempest-tests-runner" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 
17:10:36.459284 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d43e380-092b-4488-9956-0ca607448dd4" containerName="tempest-tests-tempest-tests-runner" Oct 01 17:10:36 crc kubenswrapper[4764]: E1001 17:10:36.459301 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="registry-server" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.459306 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="registry-server" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.459489 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c666b6-6a27-4e13-aef6-7438e19d9999" containerName="registry-server" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.459500 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d43e380-092b-4488-9956-0ca607448dd4" containerName="tempest-tests-tempest-tests-runner" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.460058 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.460137 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.473879 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r78ls" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.624483 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.624862 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz4r5\" (UniqueName: \"kubernetes.io/projected/28fe40a4-f653-4273-b45c-7b9503d9704f-kube-api-access-xz4r5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.726857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz4r5\" (UniqueName: \"kubernetes.io/projected/28fe40a4-f653-4273-b45c-7b9503d9704f-kube-api-access-xz4r5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.727220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.727732 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.750082 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz4r5\" (UniqueName: \"kubernetes.io/projected/28fe40a4-f653-4273-b45c-7b9503d9704f-kube-api-access-xz4r5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.755274 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"28fe40a4-f653-4273-b45c-7b9503d9704f\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:36 crc kubenswrapper[4764]: I1001 17:10:36.780646 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 17:10:37 crc kubenswrapper[4764]: I1001 17:10:37.229768 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 17:10:37 crc kubenswrapper[4764]: I1001 17:10:37.533068 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"28fe40a4-f653-4273-b45c-7b9503d9704f","Type":"ContainerStarted","Data":"a74c7aec79d63ede0a0993982071110f92ac63045e5d32415db588edb2db640f"} Oct 01 17:10:39 crc kubenswrapper[4764]: I1001 17:10:39.564908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"28fe40a4-f653-4273-b45c-7b9503d9704f","Type":"ContainerStarted","Data":"d3909b652fa826ffbf6bd934f60b47bd982584335038d70a77dcee454a6292bd"} Oct 01 17:10:39 crc kubenswrapper[4764]: I1001 17:10:39.584568 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.202355713 podStartE2EDuration="3.58455307s" podCreationTimestamp="2025-10-01 17:10:36 +0000 UTC" firstStartedPulling="2025-10-01 17:10:37.23453035 +0000 UTC m=+4100.234177185" lastFinishedPulling="2025-10-01 17:10:38.616727707 +0000 UTC m=+4101.616374542" observedRunningTime="2025-10-01 17:10:39.581238429 +0000 UTC m=+4102.580885264" watchObservedRunningTime="2025-10-01 17:10:39.58455307 +0000 UTC m=+4102.584199905" Oct 01 17:10:44 crc kubenswrapper[4764]: I1001 17:10:44.721580 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:10:44 crc kubenswrapper[4764]: E1001 17:10:44.722399 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.346417 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5rzcc/must-gather-mjwf9"] Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.348878 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.357170 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5rzcc/must-gather-mjwf9"] Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.358315 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5rzcc"/"openshift-service-ca.crt" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.363344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-5rzcc"/"default-dockercfg-2gtpv" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.366605 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5rzcc"/"kube-root-ca.crt" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.382174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kll8d\" (UniqueName: \"kubernetes.io/projected/c01ddf33-500f-470a-a3e0-43ce226d3d44-kube-api-access-kll8d\") pod \"must-gather-mjwf9\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.382517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/c01ddf33-500f-470a-a3e0-43ce226d3d44-must-gather-output\") pod \"must-gather-mjwf9\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.484306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c01ddf33-500f-470a-a3e0-43ce226d3d44-must-gather-output\") pod \"must-gather-mjwf9\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.484505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kll8d\" (UniqueName: \"kubernetes.io/projected/c01ddf33-500f-470a-a3e0-43ce226d3d44-kube-api-access-kll8d\") pod \"must-gather-mjwf9\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.484794 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c01ddf33-500f-470a-a3e0-43ce226d3d44-must-gather-output\") pod \"must-gather-mjwf9\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.503908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kll8d\" (UniqueName: \"kubernetes.io/projected/c01ddf33-500f-470a-a3e0-43ce226d3d44-kube-api-access-kll8d\") pod \"must-gather-mjwf9\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:56 crc kubenswrapper[4764]: I1001 17:10:56.667890 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:10:57 crc kubenswrapper[4764]: I1001 17:10:57.248604 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5rzcc/must-gather-mjwf9"] Oct 01 17:10:57 crc kubenswrapper[4764]: I1001 17:10:57.738376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" event={"ID":"c01ddf33-500f-470a-a3e0-43ce226d3d44","Type":"ContainerStarted","Data":"b6b4491e9b722a043974ecf77ef9f0e989fb46b54643d258e985f28cf87a1ec9"} Oct 01 17:10:59 crc kubenswrapper[4764]: I1001 17:10:59.723802 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:10:59 crc kubenswrapper[4764]: E1001 17:10:59.724454 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:11:04 crc kubenswrapper[4764]: I1001 17:11:04.805152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" event={"ID":"c01ddf33-500f-470a-a3e0-43ce226d3d44","Type":"ContainerStarted","Data":"be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e"} Oct 01 17:11:04 crc kubenswrapper[4764]: I1001 17:11:04.806582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" event={"ID":"c01ddf33-500f-470a-a3e0-43ce226d3d44","Type":"ContainerStarted","Data":"490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250"} Oct 01 17:11:04 crc kubenswrapper[4764]: I1001 17:11:04.823601 4764 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" podStartSLOduration=2.365505183 podStartE2EDuration="8.823571483s" podCreationTimestamp="2025-10-01 17:10:56 +0000 UTC" firstStartedPulling="2025-10-01 17:10:57.255286077 +0000 UTC m=+4120.254932942" lastFinishedPulling="2025-10-01 17:11:03.713352397 +0000 UTC m=+4126.712999242" observedRunningTime="2025-10-01 17:11:04.820018886 +0000 UTC m=+4127.819665721" watchObservedRunningTime="2025-10-01 17:11:04.823571483 +0000 UTC m=+4127.823218318" Oct 01 17:11:07 crc kubenswrapper[4764]: I1001 17:11:07.921436 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-r6fq6"] Oct 01 17:11:07 crc kubenswrapper[4764]: I1001 17:11:07.923132 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:07 crc kubenswrapper[4764]: I1001 17:11:07.968082 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3a66cde-64d3-4af9-8594-ebfaa6697634-host\") pod \"crc-debug-r6fq6\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:07 crc kubenswrapper[4764]: I1001 17:11:07.968159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7v9\" (UniqueName: \"kubernetes.io/projected/e3a66cde-64d3-4af9-8594-ebfaa6697634-kube-api-access-zz7v9\") pod \"crc-debug-r6fq6\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:08 crc kubenswrapper[4764]: I1001 17:11:08.070683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3a66cde-64d3-4af9-8594-ebfaa6697634-host\") pod \"crc-debug-r6fq6\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " 
pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:08 crc kubenswrapper[4764]: I1001 17:11:08.071061 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7v9\" (UniqueName: \"kubernetes.io/projected/e3a66cde-64d3-4af9-8594-ebfaa6697634-kube-api-access-zz7v9\") pod \"crc-debug-r6fq6\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:08 crc kubenswrapper[4764]: I1001 17:11:08.070870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3a66cde-64d3-4af9-8594-ebfaa6697634-host\") pod \"crc-debug-r6fq6\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:08 crc kubenswrapper[4764]: I1001 17:11:08.096363 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7v9\" (UniqueName: \"kubernetes.io/projected/e3a66cde-64d3-4af9-8594-ebfaa6697634-kube-api-access-zz7v9\") pod \"crc-debug-r6fq6\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:08 crc kubenswrapper[4764]: I1001 17:11:08.240156 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:11:08 crc kubenswrapper[4764]: I1001 17:11:08.858622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" event={"ID":"e3a66cde-64d3-4af9-8594-ebfaa6697634","Type":"ContainerStarted","Data":"8aa9fb37cf206347cf1d52bc320e7c086562df206418decbc5fab37e06d583d6"} Oct 01 17:11:14 crc kubenswrapper[4764]: I1001 17:11:14.721664 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:11:14 crc kubenswrapper[4764]: E1001 17:11:14.722509 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:11:21 crc kubenswrapper[4764]: I1001 17:11:21.032186 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" event={"ID":"e3a66cde-64d3-4af9-8594-ebfaa6697634","Type":"ContainerStarted","Data":"d57a866f94954bb0b49d80cbc108977f4ea9ed4b988c1a13f4834bf3e833f353"} Oct 01 17:11:21 crc kubenswrapper[4764]: I1001 17:11:21.052988 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" podStartSLOduration=2.694412872 podStartE2EDuration="14.052940466s" podCreationTimestamp="2025-10-01 17:11:07 +0000 UTC" firstStartedPulling="2025-10-01 17:11:08.285158212 +0000 UTC m=+4131.284805047" lastFinishedPulling="2025-10-01 17:11:19.643685806 +0000 UTC m=+4142.643332641" observedRunningTime="2025-10-01 17:11:21.045946305 +0000 UTC m=+4144.045593140" watchObservedRunningTime="2025-10-01 17:11:21.052940466 +0000 UTC 
m=+4144.052587301" Oct 01 17:11:27 crc kubenswrapper[4764]: I1001 17:11:27.721967 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:11:28 crc kubenswrapper[4764]: I1001 17:11:28.091999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"bb43c1b5b6bf993c96f4515c75fc9c7c8c81d9ee12ec27aa7fc7cbe9f2ccf27b"} Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.314165 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tf9nr"] Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.319864 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.331595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnp2j\" (UniqueName: \"kubernetes.io/projected/d43bbc02-e5d5-4f97-9c08-4e60d286571a-kube-api-access-wnp2j\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.331669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-utilities\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.331766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-catalog-content\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.347672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tf9nr"] Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.433561 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnp2j\" (UniqueName: \"kubernetes.io/projected/d43bbc02-e5d5-4f97-9c08-4e60d286571a-kube-api-access-wnp2j\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.433607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-utilities\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.433652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-catalog-content\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.434166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-catalog-content\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc 
kubenswrapper[4764]: I1001 17:12:10.434167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-utilities\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.455684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnp2j\" (UniqueName: \"kubernetes.io/projected/d43bbc02-e5d5-4f97-9c08-4e60d286571a-kube-api-access-wnp2j\") pod \"redhat-marketplace-tf9nr\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:10 crc kubenswrapper[4764]: I1001 17:12:10.676765 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:11 crc kubenswrapper[4764]: I1001 17:12:11.187305 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tf9nr"] Oct 01 17:12:11 crc kubenswrapper[4764]: I1001 17:12:11.475242 4764 generic.go:334] "Generic (PLEG): container finished" podID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerID="2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263" exitCode=0 Oct 01 17:12:11 crc kubenswrapper[4764]: I1001 17:12:11.475291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tf9nr" event={"ID":"d43bbc02-e5d5-4f97-9c08-4e60d286571a","Type":"ContainerDied","Data":"2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263"} Oct 01 17:12:11 crc kubenswrapper[4764]: I1001 17:12:11.475572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tf9nr" 
event={"ID":"d43bbc02-e5d5-4f97-9c08-4e60d286571a","Type":"ContainerStarted","Data":"ada80aefac7c5f87de22298f07a57ba401f3cca2bd5f54151ca4587148c36f27"} Oct 01 17:12:13 crc kubenswrapper[4764]: I1001 17:12:13.494643 4764 generic.go:334] "Generic (PLEG): container finished" podID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerID="96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2" exitCode=0 Oct 01 17:12:13 crc kubenswrapper[4764]: I1001 17:12:13.494735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tf9nr" event={"ID":"d43bbc02-e5d5-4f97-9c08-4e60d286571a","Type":"ContainerDied","Data":"96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2"} Oct 01 17:12:15 crc kubenswrapper[4764]: I1001 17:12:15.522563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tf9nr" event={"ID":"d43bbc02-e5d5-4f97-9c08-4e60d286571a","Type":"ContainerStarted","Data":"1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2"} Oct 01 17:12:15 crc kubenswrapper[4764]: I1001 17:12:15.545454 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tf9nr" podStartSLOduration=2.440577515 podStartE2EDuration="5.545434059s" podCreationTimestamp="2025-10-01 17:12:10 +0000 UTC" firstStartedPulling="2025-10-01 17:12:11.477495539 +0000 UTC m=+4194.477142374" lastFinishedPulling="2025-10-01 17:12:14.582352083 +0000 UTC m=+4197.581998918" observedRunningTime="2025-10-01 17:12:15.543989944 +0000 UTC m=+4198.543636779" watchObservedRunningTime="2025-10-01 17:12:15.545434059 +0000 UTC m=+4198.545080904" Oct 01 17:12:17 crc kubenswrapper[4764]: I1001 17:12:17.569551 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6bf96b65c4-djgxs_adede25f-2ef2-4d24-a18b-93865063b49f/barbican-api/0.log" Oct 01 17:12:17 crc kubenswrapper[4764]: I1001 17:12:17.836097 4764 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_barbican-api-6bf96b65c4-djgxs_adede25f-2ef2-4d24-a18b-93865063b49f/barbican-api-log/0.log" Oct 01 17:12:18 crc kubenswrapper[4764]: I1001 17:12:18.010209 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-557bf5c9c4-gn9s8_446019eb-78e7-4c76-983b-44a968141080/barbican-keystone-listener/0.log" Oct 01 17:12:18 crc kubenswrapper[4764]: I1001 17:12:18.203598 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-557bf5c9c4-gn9s8_446019eb-78e7-4c76-983b-44a968141080/barbican-keystone-listener-log/0.log" Oct 01 17:12:18 crc kubenswrapper[4764]: I1001 17:12:18.479517 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56949f8bfc-chjrr_f4930a41-989a-4747-b659-f35df5f73bd0/barbican-worker/0.log" Oct 01 17:12:18 crc kubenswrapper[4764]: I1001 17:12:18.538662 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56949f8bfc-chjrr_f4930a41-989a-4747-b659-f35df5f73bd0/barbican-worker-log/0.log" Oct 01 17:12:19 crc kubenswrapper[4764]: I1001 17:12:19.501730 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m_abe7d369-08b8-431b-9b66-3b6056a37e00/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:19 crc kubenswrapper[4764]: I1001 17:12:19.919205 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/ceilometer-central-agent/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.116837 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/sg-core/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.140031 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/proxy-httpd/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.207792 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/ceilometer-notification-agent/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.448763 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq_c539a876-f4e2-41db-aa15-6a54e4ac75c6/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.578109 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl_c353fd70-5d43-4e79-9863-9d1c4156df15/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.677755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.677806 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.735774 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.935671 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7934a61c-2af3-4c51-987b-411ee1c7645f/cinder-api-log/0.log" Oct 01 17:12:20 crc kubenswrapper[4764]: I1001 17:12:20.958315 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7934a61c-2af3-4c51-987b-411ee1c7645f/cinder-api/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.166106 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_8b47ff15-96b3-49ac-a5e5-1ce1051d53a0/probe/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.320948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8b47ff15-96b3-49ac-a5e5-1ce1051d53a0/cinder-backup/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.414503 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb0e912d-791f-436a-9e94-1e60281b6654/cinder-scheduler/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.638943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.642978 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb0e912d-791f-436a-9e94-1e60281b6654/probe/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.694960 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tf9nr"] Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.720997 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f4b10553-8ea5-49bf-96cf-c22620f1ced3/cinder-volume/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.822903 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f4b10553-8ea5-49bf-96cf-c22620f1ced3/probe/0.log" Oct 01 17:12:21 crc kubenswrapper[4764]: I1001 17:12:21.945631 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf_4f55b6f9-370f-489f-9bfd-989fbc5cd8b9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.237348 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-57zm8_e02e8e56-086f-4152-accb-b8ffdb55a215/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.374294 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-f4vwb_4d7d3e82-e4f3-48e1-99ac-949325fec6cb/init/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.486470 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-f4vwb_4d7d3e82-e4f3-48e1-99ac-949325fec6cb/dnsmasq-dns/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.497110 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-f4vwb_4d7d3e82-e4f3-48e1-99ac-949325fec6cb/init/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.704473 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48929539-7e51-4e55-bf3f-d168cab2e600/glance-log/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.710207 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48929539-7e51-4e55-bf3f-d168cab2e600/glance-httpd/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.940537 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c9860202-23f9-492f-b6b7-fd90d113ad6d/glance-httpd/0.log" Oct 01 17:12:22 crc kubenswrapper[4764]: I1001 17:12:22.951620 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c9860202-23f9-492f-b6b7-fd90d113ad6d/glance-log/0.log" Oct 01 17:12:23 crc kubenswrapper[4764]: I1001 17:12:23.279965 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-554f5d45dd-s9w79_b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d/horizon/0.log" Oct 01 17:12:23 crc kubenswrapper[4764]: I1001 17:12:23.307195 
4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-554f5d45dd-s9w79_b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d/horizon-log/0.log" Oct 01 17:12:23 crc kubenswrapper[4764]: I1001 17:12:23.604951 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tf9nr" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="registry-server" containerID="cri-o://1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2" gracePeriod=2 Oct 01 17:12:23 crc kubenswrapper[4764]: I1001 17:12:23.655954 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f_c6f12828-d7f8-45a2-932c-b866030ce666/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:23 crc kubenswrapper[4764]: I1001 17:12:23.738259 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wtrjr_a0273bd3-26f6-44d9-a665-75c9eac2cf98/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.125159 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.208689 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-utilities\") pod \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.209004 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-catalog-content\") pod \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.209076 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnp2j\" (UniqueName: \"kubernetes.io/projected/d43bbc02-e5d5-4f97-9c08-4e60d286571a-kube-api-access-wnp2j\") pod \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\" (UID: \"d43bbc02-e5d5-4f97-9c08-4e60d286571a\") " Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.209792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-utilities" (OuterVolumeSpecName: "utilities") pod "d43bbc02-e5d5-4f97-9c08-4e60d286571a" (UID: "d43bbc02-e5d5-4f97-9c08-4e60d286571a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.225153 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d43bbc02-e5d5-4f97-9c08-4e60d286571a" (UID: "d43bbc02-e5d5-4f97-9c08-4e60d286571a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.243365 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43bbc02-e5d5-4f97-9c08-4e60d286571a-kube-api-access-wnp2j" (OuterVolumeSpecName: "kube-api-access-wnp2j") pod "d43bbc02-e5d5-4f97-9c08-4e60d286571a" (UID: "d43bbc02-e5d5-4f97-9c08-4e60d286571a"). InnerVolumeSpecName "kube-api-access-wnp2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.253113 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322301-kgbcw_c858aedc-be1d-4bd6-8c80-906c5345a7df/keystone-cron/0.log" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.311631 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.311670 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnp2j\" (UniqueName: \"kubernetes.io/projected/d43bbc02-e5d5-4f97-9c08-4e60d286571a-kube-api-access-wnp2j\") on node \"crc\" DevicePath \"\"" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.311684 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d43bbc02-e5d5-4f97-9c08-4e60d286571a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.361884 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a1a83996-de2f-4abe-a075-8c0c2191eb7b/kube-state-metrics/0.log" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.585838 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d_7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.615946 4764 generic.go:334] "Generic (PLEG): container finished" podID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerID="1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2" exitCode=0 Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.615999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tf9nr" event={"ID":"d43bbc02-e5d5-4f97-9c08-4e60d286571a","Type":"ContainerDied","Data":"1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2"} Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.616030 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tf9nr" event={"ID":"d43bbc02-e5d5-4f97-9c08-4e60d286571a","Type":"ContainerDied","Data":"ada80aefac7c5f87de22298f07a57ba401f3cca2bd5f54151ca4587148c36f27"} Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.616067 4764 scope.go:117] "RemoveContainer" containerID="1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.616203 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tf9nr" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.653444 4764 scope.go:117] "RemoveContainer" containerID="96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.674584 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tf9nr"] Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.702561 4764 scope.go:117] "RemoveContainer" containerID="2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.711128 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tf9nr"] Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.752552 4764 scope.go:117] "RemoveContainer" containerID="1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2" Oct 01 17:12:24 crc kubenswrapper[4764]: E1001 17:12:24.755809 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2\": container with ID starting with 1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2 not found: ID does not exist" containerID="1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.755844 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2"} err="failed to get container status \"1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2\": rpc error: code = NotFound desc = could not find container \"1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2\": container with ID starting with 1e71aef33925907b8a33d2fea41519b984fbca599d77a41d4f1e43688d65ebe2 not found: 
ID does not exist" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.755879 4764 scope.go:117] "RemoveContainer" containerID="96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2" Oct 01 17:12:24 crc kubenswrapper[4764]: E1001 17:12:24.767780 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2\": container with ID starting with 96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2 not found: ID does not exist" containerID="96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.767822 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2"} err="failed to get container status \"96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2\": rpc error: code = NotFound desc = could not find container \"96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2\": container with ID starting with 96d006699c50fe47a60474e22f8b9962e2e5f8f02e939fdb79c2365d3abeb6a2 not found: ID does not exist" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.767848 4764 scope.go:117] "RemoveContainer" containerID="2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263" Oct 01 17:12:24 crc kubenswrapper[4764]: E1001 17:12:24.770237 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263\": container with ID starting with 2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263 not found: ID does not exist" containerID="2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.770282 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263"} err="failed to get container status \"2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263\": rpc error: code = NotFound desc = could not find container \"2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263\": container with ID starting with 2fea1d090ab4c635f958c33646ff018ba85df04dedcbdb9f5d847f8e6bfff263 not found: ID does not exist" Oct 01 17:12:24 crc kubenswrapper[4764]: I1001 17:12:24.794395 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78f7fcb65-9gxk4_3b093936-cdfc-4f2c-a8a6-86820b145b73/keystone-api/0.log" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.104125 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e/manila-api/0.log" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.178474 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_0f9e419b-d3d5-4813-a276-aac1f18ef4f4/probe/0.log" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.302614 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_0f9e419b-d3d5-4813-a276-aac1f18ef4f4/manila-scheduler/0.log" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.526907 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ca31e39f-7fcb-4389-882d-cfa2d4491df4/probe/0.log" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.684733 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ca31e39f-7fcb-4389-882d-cfa2d4491df4/manila-share/0.log" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.734227 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" 
path="/var/lib/kubelet/pods/d43bbc02-e5d5-4f97-9c08-4e60d286571a/volumes" Oct 01 17:12:25 crc kubenswrapper[4764]: I1001 17:12:25.743826 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e/manila-api-log/0.log" Oct 01 17:12:26 crc kubenswrapper[4764]: I1001 17:12:26.667296 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d577ff6cf-5gk59_acde2ba2-32bc-4d80-aa8d-dd6505c14da3/neutron-httpd/0.log" Oct 01 17:12:26 crc kubenswrapper[4764]: I1001 17:12:26.720517 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d577ff6cf-5gk59_acde2ba2-32bc-4d80-aa8d-dd6505c14da3/neutron-api/0.log" Oct 01 17:12:26 crc kubenswrapper[4764]: I1001 17:12:26.920634 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw_199ab555-85f7-4168-9e83-a5060e006dc4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:27 crc kubenswrapper[4764]: I1001 17:12:27.354777 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_445abcb4-96ed-403c-bf18-0c1bc5440182/nova-api-log/0.log" Oct 01 17:12:27 crc kubenswrapper[4764]: I1001 17:12:27.442539 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_445abcb4-96ed-403c-bf18-0c1bc5440182/nova-api-api/0.log" Oct 01 17:12:27 crc kubenswrapper[4764]: I1001 17:12:27.627025 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa/nova-cell0-conductor-conductor/0.log" Oct 01 17:12:27 crc kubenswrapper[4764]: I1001 17:12:27.768106 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_78988164-5797-4cee-a8a9-7f87adeb170a/nova-cell1-conductor-conductor/0.log" Oct 01 17:12:28 crc kubenswrapper[4764]: I1001 17:12:28.383485 4764 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_384d37eb-2732-48d4-b38d-2befbd3d0cce/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 17:12:28 crc kubenswrapper[4764]: I1001 17:12:28.628607 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh_473bdd59-1196-45be-931d-f452ce6bc2fa/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:29 crc kubenswrapper[4764]: I1001 17:12:29.084878 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faef6b6-44c8-4251-981a-ca6f0eddeda1/nova-metadata-log/0.log" Oct 01 17:12:29 crc kubenswrapper[4764]: I1001 17:12:29.455558 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6e705f16-7e06-46aa-a290-f42760df1c2c/nova-scheduler-scheduler/0.log" Oct 01 17:12:29 crc kubenswrapper[4764]: I1001 17:12:29.907810 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca/mysql-bootstrap/0.log" Oct 01 17:12:29 crc kubenswrapper[4764]: I1001 17:12:29.908488 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca/mysql-bootstrap/0.log" Oct 01 17:12:30 crc kubenswrapper[4764]: I1001 17:12:30.441572 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4acbf2ba-c326-445b-b6f6-11458a1dfb68/mysql-bootstrap/0.log" Oct 01 17:12:30 crc kubenswrapper[4764]: I1001 17:12:30.441775 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca/galera/0.log" Oct 01 17:12:30 crc kubenswrapper[4764]: I1001 17:12:30.715224 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4acbf2ba-c326-445b-b6f6-11458a1dfb68/galera/0.log" Oct 01 17:12:30 crc kubenswrapper[4764]: I1001 
17:12:30.832131 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4acbf2ba-c326-445b-b6f6-11458a1dfb68/mysql-bootstrap/0.log" Oct 01 17:12:31 crc kubenswrapper[4764]: I1001 17:12:31.189027 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2d91465c-097e-4579-a5de-df0547d06dbf/openstackclient/0.log" Oct 01 17:12:31 crc kubenswrapper[4764]: I1001 17:12:31.468731 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8kfk_80dd0b93-add9-4524-8cef-a32b4250e094/openstack-network-exporter/0.log" Oct 01 17:12:31 crc kubenswrapper[4764]: I1001 17:12:31.684856 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faef6b6-44c8-4251-981a-ca6f0eddeda1/nova-metadata-metadata/0.log" Oct 01 17:12:31 crc kubenswrapper[4764]: I1001 17:12:31.765795 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovsdb-server-init/0.log" Oct 01 17:12:31 crc kubenswrapper[4764]: I1001 17:12:31.882523 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovsdb-server-init/0.log" Oct 01 17:12:31 crc kubenswrapper[4764]: I1001 17:12:31.890767 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovs-vswitchd/0.log" Oct 01 17:12:32 crc kubenswrapper[4764]: I1001 17:12:32.006400 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovsdb-server/0.log" Oct 01 17:12:32 crc kubenswrapper[4764]: I1001 17:12:32.217624 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wjjhq_79962852-f159-44df-bd50-38928f3df91d/ovn-controller/0.log" Oct 01 17:12:32 crc kubenswrapper[4764]: I1001 
17:12:32.448263 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xjmq8_ad66f863-1f1b-40f8-8a3f-464eaf32a344/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:32 crc kubenswrapper[4764]: I1001 17:12:32.516004 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ca14b5f3-e2fc-4fc1-9800-d64209a4c266/openstack-network-exporter/0.log" Oct 01 17:12:32 crc kubenswrapper[4764]: I1001 17:12:32.745301 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ca14b5f3-e2fc-4fc1-9800-d64209a4c266/ovn-northd/0.log" Oct 01 17:12:33 crc kubenswrapper[4764]: I1001 17:12:33.006829 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e870edc1-ed3c-4c16-8c20-cde661ac4ce0/openstack-network-exporter/0.log" Oct 01 17:12:33 crc kubenswrapper[4764]: I1001 17:12:33.222685 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e870edc1-ed3c-4c16-8c20-cde661ac4ce0/ovsdbserver-nb/0.log" Oct 01 17:12:33 crc kubenswrapper[4764]: I1001 17:12:33.318641 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_923cf4a9-e116-4a84-ae06-dec150a649bc/openstack-network-exporter/0.log" Oct 01 17:12:33 crc kubenswrapper[4764]: I1001 17:12:33.437825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_923cf4a9-e116-4a84-ae06-dec150a649bc/ovsdbserver-sb/0.log" Oct 01 17:12:33 crc kubenswrapper[4764]: I1001 17:12:33.667910 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8695dd9c7b-mwdsh_d43038a4-064b-4ecf-bebf-0f4d6116a839/placement-api/0.log" Oct 01 17:12:33 crc kubenswrapper[4764]: I1001 17:12:33.879253 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8695dd9c7b-mwdsh_d43038a4-064b-4ecf-bebf-0f4d6116a839/placement-log/0.log" Oct 01 17:12:33 crc 
kubenswrapper[4764]: I1001 17:12:33.949371 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e6c99317-e5aa-4c87-a45a-34e4d14846e4/setup-container/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.183024 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e6c99317-e5aa-4c87-a45a-34e4d14846e4/setup-container/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.243603 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e6c99317-e5aa-4c87-a45a-34e4d14846e4/rabbitmq/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.416686 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_17487462-b952-4428-a875-61732b895017/setup-container/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.627345 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_17487462-b952-4428-a875-61732b895017/setup-container/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.737776 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_17487462-b952-4428-a875-61732b895017/rabbitmq/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.877374 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc_a20f9deb-1422-462b-81d1-89cfef47f81d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:34 crc kubenswrapper[4764]: I1001 17:12:34.965898 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf_839c68a1-2404-4037-8975-58e6b02ba81f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:35 crc kubenswrapper[4764]: I1001 17:12:35.223266 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bj86q_a0fbe741-65d7-464f-b6c6-ecdb60f8bb21/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:35 crc kubenswrapper[4764]: I1001 17:12:35.414738 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5rdtx_56678028-55b3-410f-a642-999c1f035e88/ssh-known-hosts-edpm-deployment/0.log" Oct 01 17:12:35 crc kubenswrapper[4764]: I1001 17:12:35.555308 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0d43e380-092b-4488-9956-0ca607448dd4/tempest-tests-tempest-tests-runner/0.log" Oct 01 17:12:35 crc kubenswrapper[4764]: I1001 17:12:35.748452 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_28fe40a4-f653-4273-b45c-7b9503d9704f/test-operator-logs-container/0.log" Oct 01 17:12:36 crc kubenswrapper[4764]: I1001 17:12:36.278663 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt_0134afb9-9d23-47e6-9d46-6a025c3a3a57/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.282769 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q4vfn"] Oct 01 17:12:41 crc kubenswrapper[4764]: E1001 17:12:41.283679 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="registry-server" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.283756 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="registry-server" Oct 01 17:12:41 crc kubenswrapper[4764]: E1001 17:12:41.283801 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="extract-content" Oct 01 17:12:41 
crc kubenswrapper[4764]: I1001 17:12:41.283807 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="extract-content" Oct 01 17:12:41 crc kubenswrapper[4764]: E1001 17:12:41.283820 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="extract-utilities" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.283826 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="extract-utilities" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.284023 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d43bbc02-e5d5-4f97-9c08-4e60d286571a" containerName="registry-server" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.286607 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.299858 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q4vfn"] Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.462575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-catalog-content\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.462885 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-utilities\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc 
kubenswrapper[4764]: I1001 17:12:41.463018 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxbfz\" (UniqueName: \"kubernetes.io/projected/e993c926-d105-4cb9-911c-150e4777b741-kube-api-access-bxbfz\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.564652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-catalog-content\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.564737 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-utilities\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.564795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxbfz\" (UniqueName: \"kubernetes.io/projected/e993c926-d105-4cb9-911c-150e4777b741-kube-api-access-bxbfz\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.565637 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-catalog-content\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 
17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.565858 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-utilities\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.589482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxbfz\" (UniqueName: \"kubernetes.io/projected/e993c926-d105-4cb9-911c-150e4777b741-kube-api-access-bxbfz\") pod \"community-operators-q4vfn\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:41 crc kubenswrapper[4764]: I1001 17:12:41.607080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:12:42 crc kubenswrapper[4764]: I1001 17:12:42.470704 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q4vfn"] Oct 01 17:12:42 crc kubenswrapper[4764]: I1001 17:12:42.861779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerStarted","Data":"530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8"} Oct 01 17:12:42 crc kubenswrapper[4764]: I1001 17:12:42.862150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerStarted","Data":"c63929cb3c9ed2af6a1b583b2a992b4aa84439aa970bd0e0a7bc12e5efbd6b81"} Oct 01 17:12:43 crc kubenswrapper[4764]: I1001 17:12:43.872488 4764 generic.go:334] "Generic (PLEG): container finished" podID="e993c926-d105-4cb9-911c-150e4777b741" 
containerID="530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8" exitCode=0 Oct 01 17:12:43 crc kubenswrapper[4764]: I1001 17:12:43.872801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerDied","Data":"530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8"} Oct 01 17:12:45 crc kubenswrapper[4764]: I1001 17:12:45.902197 4764 generic.go:334] "Generic (PLEG): container finished" podID="e993c926-d105-4cb9-911c-150e4777b741" containerID="c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521" exitCode=0 Oct 01 17:12:45 crc kubenswrapper[4764]: I1001 17:12:45.902850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerDied","Data":"c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521"} Oct 01 17:12:49 crc kubenswrapper[4764]: I1001 17:12:49.736921 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c346177b-4aeb-43b2-8f86-ce57d0d42c10/memcached/0.log" Oct 01 17:12:53 crc kubenswrapper[4764]: I1001 17:12:53.975851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerStarted","Data":"0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40"} Oct 01 17:12:54 crc kubenswrapper[4764]: I1001 17:12:54.002920 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q4vfn" podStartSLOduration=4.818473062 podStartE2EDuration="13.002903058s" podCreationTimestamp="2025-10-01 17:12:41 +0000 UTC" firstStartedPulling="2025-10-01 17:12:43.875546539 +0000 UTC m=+4226.875193374" lastFinishedPulling="2025-10-01 17:12:52.059976525 +0000 UTC m=+4235.059623370" 
observedRunningTime="2025-10-01 17:12:53.995070156 +0000 UTC m=+4236.994716991" watchObservedRunningTime="2025-10-01 17:12:54.002903058 +0000 UTC m=+4237.002549893" Oct 01 17:13:01 crc kubenswrapper[4764]: I1001 17:13:01.607708 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:13:01 crc kubenswrapper[4764]: I1001 17:13:01.608371 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:13:01 crc kubenswrapper[4764]: I1001 17:13:01.675406 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:13:02 crc kubenswrapper[4764]: I1001 17:13:02.114213 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:13:02 crc kubenswrapper[4764]: I1001 17:13:02.173989 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q4vfn"] Oct 01 17:13:04 crc kubenswrapper[4764]: I1001 17:13:04.069838 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q4vfn" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="registry-server" containerID="cri-o://0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40" gracePeriod=2 Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.060428 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.090427 4764 generic.go:334] "Generic (PLEG): container finished" podID="e993c926-d105-4cb9-911c-150e4777b741" containerID="0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40" exitCode=0 Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.090486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerDied","Data":"0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40"} Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.090543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q4vfn" event={"ID":"e993c926-d105-4cb9-911c-150e4777b741","Type":"ContainerDied","Data":"c63929cb3c9ed2af6a1b583b2a992b4aa84439aa970bd0e0a7bc12e5efbd6b81"} Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.090571 4764 scope.go:117] "RemoveContainer" containerID="0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.090568 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q4vfn" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.129838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxbfz\" (UniqueName: \"kubernetes.io/projected/e993c926-d105-4cb9-911c-150e4777b741-kube-api-access-bxbfz\") pod \"e993c926-d105-4cb9-911c-150e4777b741\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.129984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-catalog-content\") pod \"e993c926-d105-4cb9-911c-150e4777b741\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.130074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-utilities\") pod \"e993c926-d105-4cb9-911c-150e4777b741\" (UID: \"e993c926-d105-4cb9-911c-150e4777b741\") " Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.132383 4764 scope.go:117] "RemoveContainer" containerID="c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.138669 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-utilities" (OuterVolumeSpecName: "utilities") pod "e993c926-d105-4cb9-911c-150e4777b741" (UID: "e993c926-d105-4cb9-911c-150e4777b741"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.162882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e993c926-d105-4cb9-911c-150e4777b741-kube-api-access-bxbfz" (OuterVolumeSpecName: "kube-api-access-bxbfz") pod "e993c926-d105-4cb9-911c-150e4777b741" (UID: "e993c926-d105-4cb9-911c-150e4777b741"). InnerVolumeSpecName "kube-api-access-bxbfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.186102 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e993c926-d105-4cb9-911c-150e4777b741" (UID: "e993c926-d105-4cb9-911c-150e4777b741"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.213575 4764 scope.go:117] "RemoveContainer" containerID="530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.233262 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.233308 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e993c926-d105-4cb9-911c-150e4777b741-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.233323 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxbfz\" (UniqueName: \"kubernetes.io/projected/e993c926-d105-4cb9-911c-150e4777b741-kube-api-access-bxbfz\") on node \"crc\" DevicePath \"\"" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 
17:13:05.255388 4764 scope.go:117] "RemoveContainer" containerID="0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40" Oct 01 17:13:05 crc kubenswrapper[4764]: E1001 17:13:05.256434 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40\": container with ID starting with 0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40 not found: ID does not exist" containerID="0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.256503 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40"} err="failed to get container status \"0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40\": rpc error: code = NotFound desc = could not find container \"0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40\": container with ID starting with 0cb56a7480832e0cb7f12655b977ee86854c9fcc6b85c30c41eb40b1f2e4ca40 not found: ID does not exist" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.256546 4764 scope.go:117] "RemoveContainer" containerID="c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521" Oct 01 17:13:05 crc kubenswrapper[4764]: E1001 17:13:05.257097 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521\": container with ID starting with c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521 not found: ID does not exist" containerID="c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.257133 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521"} err="failed to get container status \"c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521\": rpc error: code = NotFound desc = could not find container \"c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521\": container with ID starting with c033ae1a8c05414f188c98ee29fae91595d1d3e59225d2766739597614677521 not found: ID does not exist" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.257151 4764 scope.go:117] "RemoveContainer" containerID="530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8" Oct 01 17:13:05 crc kubenswrapper[4764]: E1001 17:13:05.257820 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8\": container with ID starting with 530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8 not found: ID does not exist" containerID="530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.257867 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8"} err="failed to get container status \"530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8\": rpc error: code = NotFound desc = could not find container \"530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8\": container with ID starting with 530639602ef0cc195dfd35fac8bfcbcd7d9349c5818d3d5c25f60d1eb66bd7c8 not found: ID does not exist" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.442704 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q4vfn"] Oct 01 17:13:05 crc kubenswrapper[4764]: E1001 17:13:05.451773 4764 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode993c926_d105_4cb9_911c_150e4777b741.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.453263 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q4vfn"] Oct 01 17:13:05 crc kubenswrapper[4764]: I1001 17:13:05.732998 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e993c926-d105-4cb9-911c-150e4777b741" path="/var/lib/kubelet/pods/e993c926-d105-4cb9-911c-150e4777b741/volumes" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.266001 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l5prp"] Oct 01 17:13:34 crc kubenswrapper[4764]: E1001 17:13:34.267219 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="extract-utilities" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.267237 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="extract-utilities" Oct 01 17:13:34 crc kubenswrapper[4764]: E1001 17:13:34.267291 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="registry-server" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.267300 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="registry-server" Oct 01 17:13:34 crc kubenswrapper[4764]: E1001 17:13:34.267311 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="extract-content" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.267320 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="extract-content" 
Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.267632 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e993c926-d105-4cb9-911c-150e4777b741" containerName="registry-server" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.269942 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.277697 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l5prp"] Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.333929 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-utilities\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.334281 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg679\" (UniqueName: \"kubernetes.io/projected/d76e1ea5-5301-4411-8d0e-cff6e10b0722-kube-api-access-cg679\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.334383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-catalog-content\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.436610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-utilities\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.436735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg679\" (UniqueName: \"kubernetes.io/projected/d76e1ea5-5301-4411-8d0e-cff6e10b0722-kube-api-access-cg679\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.436762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-catalog-content\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.437246 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-catalog-content\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.437633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-utilities\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.471507 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg679\" (UniqueName: 
\"kubernetes.io/projected/d76e1ea5-5301-4411-8d0e-cff6e10b0722-kube-api-access-cg679\") pod \"certified-operators-l5prp\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:34 crc kubenswrapper[4764]: I1001 17:13:34.599671 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:35 crc kubenswrapper[4764]: I1001 17:13:35.155661 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l5prp"] Oct 01 17:13:35 crc kubenswrapper[4764]: I1001 17:13:35.390826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerStarted","Data":"61737eda027b4b787f0d61278f7189788953352391cb1676f98d108cfc733ce7"} Oct 01 17:13:36 crc kubenswrapper[4764]: I1001 17:13:36.402033 4764 generic.go:334] "Generic (PLEG): container finished" podID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerID="e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0" exitCode=0 Oct 01 17:13:36 crc kubenswrapper[4764]: I1001 17:13:36.402338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerDied","Data":"e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0"} Oct 01 17:13:36 crc kubenswrapper[4764]: I1001 17:13:36.403955 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 17:13:38 crc kubenswrapper[4764]: I1001 17:13:38.424733 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerStarted","Data":"6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4"} Oct 01 17:13:45 
crc kubenswrapper[4764]: I1001 17:13:45.497375 4764 generic.go:334] "Generic (PLEG): container finished" podID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerID="6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4" exitCode=0 Oct 01 17:13:45 crc kubenswrapper[4764]: I1001 17:13:45.497456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerDied","Data":"6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4"} Oct 01 17:13:48 crc kubenswrapper[4764]: I1001 17:13:48.530624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerStarted","Data":"38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8"} Oct 01 17:13:48 crc kubenswrapper[4764]: I1001 17:13:48.556295 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l5prp" podStartSLOduration=3.718200668 podStartE2EDuration="14.555920022s" podCreationTimestamp="2025-10-01 17:13:34 +0000 UTC" firstStartedPulling="2025-10-01 17:13:36.403768887 +0000 UTC m=+4279.403415722" lastFinishedPulling="2025-10-01 17:13:47.241488221 +0000 UTC m=+4290.241135076" observedRunningTime="2025-10-01 17:13:48.552113318 +0000 UTC m=+4291.551760183" watchObservedRunningTime="2025-10-01 17:13:48.555920022 +0000 UTC m=+4291.555566857" Oct 01 17:13:51 crc kubenswrapper[4764]: I1001 17:13:51.914411 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:13:51 crc kubenswrapper[4764]: I1001 17:13:51.914871 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:13:54 crc kubenswrapper[4764]: I1001 17:13:54.599769 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:54 crc kubenswrapper[4764]: I1001 17:13:54.600445 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:13:55 crc kubenswrapper[4764]: I1001 17:13:55.756270 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-l5prp" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="registry-server" probeResult="failure" output=< Oct 01 17:13:55 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Oct 01 17:13:55 crc kubenswrapper[4764]: > Oct 01 17:14:05 crc kubenswrapper[4764]: I1001 17:14:05.213190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:14:05 crc kubenswrapper[4764]: I1001 17:14:05.274534 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:14:05 crc kubenswrapper[4764]: I1001 17:14:05.474319 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l5prp"] Oct 01 17:14:06 crc kubenswrapper[4764]: I1001 17:14:06.744002 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l5prp" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="registry-server" containerID="cri-o://38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8" gracePeriod=2 
Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.191283 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.274531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg679\" (UniqueName: \"kubernetes.io/projected/d76e1ea5-5301-4411-8d0e-cff6e10b0722-kube-api-access-cg679\") pod \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.274626 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-utilities\") pod \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.274838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-catalog-content\") pod \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\" (UID: \"d76e1ea5-5301-4411-8d0e-cff6e10b0722\") " Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.275415 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-utilities" (OuterVolumeSpecName: "utilities") pod "d76e1ea5-5301-4411-8d0e-cff6e10b0722" (UID: "d76e1ea5-5301-4411-8d0e-cff6e10b0722"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.275633 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.280938 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d76e1ea5-5301-4411-8d0e-cff6e10b0722-kube-api-access-cg679" (OuterVolumeSpecName: "kube-api-access-cg679") pod "d76e1ea5-5301-4411-8d0e-cff6e10b0722" (UID: "d76e1ea5-5301-4411-8d0e-cff6e10b0722"). InnerVolumeSpecName "kube-api-access-cg679". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.325717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d76e1ea5-5301-4411-8d0e-cff6e10b0722" (UID: "d76e1ea5-5301-4411-8d0e-cff6e10b0722"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.377343 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg679\" (UniqueName: \"kubernetes.io/projected/d76e1ea5-5301-4411-8d0e-cff6e10b0722-kube-api-access-cg679\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.377376 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d76e1ea5-5301-4411-8d0e-cff6e10b0722-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.753538 4764 generic.go:334] "Generic (PLEG): container finished" podID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerID="38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8" exitCode=0 Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.753604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerDied","Data":"38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8"} Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.753646 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5prp" event={"ID":"d76e1ea5-5301-4411-8d0e-cff6e10b0722","Type":"ContainerDied","Data":"61737eda027b4b787f0d61278f7189788953352391cb1676f98d108cfc733ce7"} Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.753669 4764 scope.go:117] "RemoveContainer" containerID="38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.753671 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l5prp" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.783806 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l5prp"] Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.791190 4764 scope.go:117] "RemoveContainer" containerID="6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.791969 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l5prp"] Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.814792 4764 scope.go:117] "RemoveContainer" containerID="e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.858887 4764 scope.go:117] "RemoveContainer" containerID="38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8" Oct 01 17:14:07 crc kubenswrapper[4764]: E1001 17:14:07.859810 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8\": container with ID starting with 38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8 not found: ID does not exist" containerID="38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.859849 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8"} err="failed to get container status \"38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8\": rpc error: code = NotFound desc = could not find container \"38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8\": container with ID starting with 38091333a1f0b10dd252ec13b9daf996b5899b952296a614eb5369580fe0f5a8 not 
found: ID does not exist" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.859903 4764 scope.go:117] "RemoveContainer" containerID="6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4" Oct 01 17:14:07 crc kubenswrapper[4764]: E1001 17:14:07.860240 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4\": container with ID starting with 6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4 not found: ID does not exist" containerID="6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.860287 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4"} err="failed to get container status \"6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4\": rpc error: code = NotFound desc = could not find container \"6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4\": container with ID starting with 6cbe33fbcad60e83471f14d26595c67667305157ee5c2e7732ddfdc8460ecce4 not found: ID does not exist" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.860318 4764 scope.go:117] "RemoveContainer" containerID="e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0" Oct 01 17:14:07 crc kubenswrapper[4764]: E1001 17:14:07.860748 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0\": container with ID starting with e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0 not found: ID does not exist" containerID="e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0" Oct 01 17:14:07 crc kubenswrapper[4764]: I1001 17:14:07.860779 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0"} err="failed to get container status \"e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0\": rpc error: code = NotFound desc = could not find container \"e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0\": container with ID starting with e92aec5835347a1723ad7714ce5af918ec89c58a71d1c6b9cd78fefb1a26afa0 not found: ID does not exist" Oct 01 17:14:09 crc kubenswrapper[4764]: I1001 17:14:09.735657 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" path="/var/lib/kubelet/pods/d76e1ea5-5301-4411-8d0e-cff6e10b0722/volumes" Oct 01 17:14:14 crc kubenswrapper[4764]: I1001 17:14:14.858773 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3a66cde-64d3-4af9-8594-ebfaa6697634" containerID="d57a866f94954bb0b49d80cbc108977f4ea9ed4b988c1a13f4834bf3e833f353" exitCode=0 Oct 01 17:14:14 crc kubenswrapper[4764]: I1001 17:14:14.858856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" event={"ID":"e3a66cde-64d3-4af9-8594-ebfaa6697634","Type":"ContainerDied","Data":"d57a866f94954bb0b49d80cbc108977f4ea9ed4b988c1a13f4834bf3e833f353"} Oct 01 17:14:15 crc kubenswrapper[4764]: I1001 17:14:15.987864 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.027595 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-r6fq6"] Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.040398 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-r6fq6"] Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.055412 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz7v9\" (UniqueName: \"kubernetes.io/projected/e3a66cde-64d3-4af9-8594-ebfaa6697634-kube-api-access-zz7v9\") pod \"e3a66cde-64d3-4af9-8594-ebfaa6697634\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.055654 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3a66cde-64d3-4af9-8594-ebfaa6697634-host\") pod \"e3a66cde-64d3-4af9-8594-ebfaa6697634\" (UID: \"e3a66cde-64d3-4af9-8594-ebfaa6697634\") " Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.055721 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3a66cde-64d3-4af9-8594-ebfaa6697634-host" (OuterVolumeSpecName: "host") pod "e3a66cde-64d3-4af9-8594-ebfaa6697634" (UID: "e3a66cde-64d3-4af9-8594-ebfaa6697634"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.056333 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e3a66cde-64d3-4af9-8594-ebfaa6697634-host\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.061788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a66cde-64d3-4af9-8594-ebfaa6697634-kube-api-access-zz7v9" (OuterVolumeSpecName: "kube-api-access-zz7v9") pod "e3a66cde-64d3-4af9-8594-ebfaa6697634" (UID: "e3a66cde-64d3-4af9-8594-ebfaa6697634"). InnerVolumeSpecName "kube-api-access-zz7v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.158296 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz7v9\" (UniqueName: \"kubernetes.io/projected/e3a66cde-64d3-4af9-8594-ebfaa6697634-kube-api-access-zz7v9\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.882113 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa9fb37cf206347cf1d52bc320e7c086562df206418decbc5fab37e06d583d6" Oct 01 17:14:16 crc kubenswrapper[4764]: I1001 17:14:16.882143 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-r6fq6" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.180332 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-zm22t"] Oct 01 17:14:17 crc kubenswrapper[4764]: E1001 17:14:17.180714 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a66cde-64d3-4af9-8594-ebfaa6697634" containerName="container-00" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.180726 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a66cde-64d3-4af9-8594-ebfaa6697634" containerName="container-00" Oct 01 17:14:17 crc kubenswrapper[4764]: E1001 17:14:17.180769 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="extract-utilities" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.180778 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="extract-utilities" Oct 01 17:14:17 crc kubenswrapper[4764]: E1001 17:14:17.180798 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="extract-content" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.180806 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="extract-content" Oct 01 17:14:17 crc kubenswrapper[4764]: E1001 17:14:17.180836 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="registry-server" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.180842 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" containerName="registry-server" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.181020 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d76e1ea5-5301-4411-8d0e-cff6e10b0722" 
containerName="registry-server" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.181081 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a66cde-64d3-4af9-8594-ebfaa6697634" containerName="container-00" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.181690 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.280971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpmv\" (UniqueName: \"kubernetes.io/projected/d111ce74-d129-47f2-a483-55b0fc8d2696-kube-api-access-4cpmv\") pod \"crc-debug-zm22t\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.281099 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d111ce74-d129-47f2-a483-55b0fc8d2696-host\") pod \"crc-debug-zm22t\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.382602 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpmv\" (UniqueName: \"kubernetes.io/projected/d111ce74-d129-47f2-a483-55b0fc8d2696-kube-api-access-4cpmv\") pod \"crc-debug-zm22t\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.384511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d111ce74-d129-47f2-a483-55b0fc8d2696-host\") pod \"crc-debug-zm22t\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc 
kubenswrapper[4764]: I1001 17:14:17.384609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d111ce74-d129-47f2-a483-55b0fc8d2696-host\") pod \"crc-debug-zm22t\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.403902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cpmv\" (UniqueName: \"kubernetes.io/projected/d111ce74-d129-47f2-a483-55b0fc8d2696-kube-api-access-4cpmv\") pod \"crc-debug-zm22t\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.498378 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.746767 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a66cde-64d3-4af9-8594-ebfaa6697634" path="/var/lib/kubelet/pods/e3a66cde-64d3-4af9-8594-ebfaa6697634/volumes" Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.892680 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" event={"ID":"d111ce74-d129-47f2-a483-55b0fc8d2696","Type":"ContainerStarted","Data":"6fbf80603f25504118c76cfe9a7b047f01700edda9b0fb4a20306a81a6497fc7"} Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.892771 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" event={"ID":"d111ce74-d129-47f2-a483-55b0fc8d2696","Type":"ContainerStarted","Data":"e4c5d68fe29b8dd293022e2d671e2094c2e47891d6a4d6d633eee36d0d9259a4"} Oct 01 17:14:17 crc kubenswrapper[4764]: I1001 17:14:17.909730 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" 
podStartSLOduration=0.90971048 podStartE2EDuration="909.71048ms" podCreationTimestamp="2025-10-01 17:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 17:14:17.905505077 +0000 UTC m=+4320.905151912" watchObservedRunningTime="2025-10-01 17:14:17.90971048 +0000 UTC m=+4320.909357305" Oct 01 17:14:18 crc kubenswrapper[4764]: I1001 17:14:18.904993 4764 generic.go:334] "Generic (PLEG): container finished" podID="d111ce74-d129-47f2-a483-55b0fc8d2696" containerID="6fbf80603f25504118c76cfe9a7b047f01700edda9b0fb4a20306a81a6497fc7" exitCode=0 Oct 01 17:14:18 crc kubenswrapper[4764]: I1001 17:14:18.905090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" event={"ID":"d111ce74-d129-47f2-a483-55b0fc8d2696","Type":"ContainerDied","Data":"6fbf80603f25504118c76cfe9a7b047f01700edda9b0fb4a20306a81a6497fc7"} Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.020845 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.132277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d111ce74-d129-47f2-a483-55b0fc8d2696-host\") pod \"d111ce74-d129-47f2-a483-55b0fc8d2696\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.132300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d111ce74-d129-47f2-a483-55b0fc8d2696-host" (OuterVolumeSpecName: "host") pod "d111ce74-d129-47f2-a483-55b0fc8d2696" (UID: "d111ce74-d129-47f2-a483-55b0fc8d2696"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.132328 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cpmv\" (UniqueName: \"kubernetes.io/projected/d111ce74-d129-47f2-a483-55b0fc8d2696-kube-api-access-4cpmv\") pod \"d111ce74-d129-47f2-a483-55b0fc8d2696\" (UID: \"d111ce74-d129-47f2-a483-55b0fc8d2696\") " Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.133888 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d111ce74-d129-47f2-a483-55b0fc8d2696-host\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.139375 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d111ce74-d129-47f2-a483-55b0fc8d2696-kube-api-access-4cpmv" (OuterVolumeSpecName: "kube-api-access-4cpmv") pod "d111ce74-d129-47f2-a483-55b0fc8d2696" (UID: "d111ce74-d129-47f2-a483-55b0fc8d2696"). InnerVolumeSpecName "kube-api-access-4cpmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.235351 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cpmv\" (UniqueName: \"kubernetes.io/projected/d111ce74-d129-47f2-a483-55b0fc8d2696-kube-api-access-4cpmv\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.923538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" event={"ID":"d111ce74-d129-47f2-a483-55b0fc8d2696","Type":"ContainerDied","Data":"e4c5d68fe29b8dd293022e2d671e2094c2e47891d6a4d6d633eee36d0d9259a4"} Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.923589 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4c5d68fe29b8dd293022e2d671e2094c2e47891d6a4d6d633eee36d0d9259a4" Oct 01 17:14:20 crc kubenswrapper[4764]: I1001 17:14:20.923599 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-zm22t" Oct 01 17:14:21 crc kubenswrapper[4764]: I1001 17:14:21.913572 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:14:21 crc kubenswrapper[4764]: I1001 17:14:21.914098 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:14:27 crc kubenswrapper[4764]: I1001 17:14:27.002963 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-zm22t"] Oct 01 17:14:27 
crc kubenswrapper[4764]: I1001 17:14:27.014732 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-zm22t"] Oct 01 17:14:27 crc kubenswrapper[4764]: I1001 17:14:27.736523 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d111ce74-d129-47f2-a483-55b0fc8d2696" path="/var/lib/kubelet/pods/d111ce74-d129-47f2-a483-55b0fc8d2696/volumes" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.180682 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-564j5"] Oct 01 17:14:28 crc kubenswrapper[4764]: E1001 17:14:28.181060 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d111ce74-d129-47f2-a483-55b0fc8d2696" containerName="container-00" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.181073 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d111ce74-d129-47f2-a483-55b0fc8d2696" containerName="container-00" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.181242 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d111ce74-d129-47f2-a483-55b0fc8d2696" containerName="container-00" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.181871 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.362576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-host\") pod \"crc-debug-564j5\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.362915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzdxb\" (UniqueName: \"kubernetes.io/projected/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-kube-api-access-rzdxb\") pod \"crc-debug-564j5\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.464446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-host\") pod \"crc-debug-564j5\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.464539 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzdxb\" (UniqueName: \"kubernetes.io/projected/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-kube-api-access-rzdxb\") pod \"crc-debug-564j5\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.464590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-host\") pod \"crc-debug-564j5\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc 
kubenswrapper[4764]: I1001 17:14:28.575907 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzdxb\" (UniqueName: \"kubernetes.io/projected/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-kube-api-access-rzdxb\") pod \"crc-debug-564j5\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:28 crc kubenswrapper[4764]: I1001 17:14:28.817772 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:29 crc kubenswrapper[4764]: I1001 17:14:29.019486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-564j5" event={"ID":"a450e464-695e-4ca0-b8c4-4989f3c0d9ef","Type":"ContainerStarted","Data":"780d6ef3c0ce5a23b8429d157d6abd92a063718ff6756c623f2fb83d360a609e"} Oct 01 17:14:30 crc kubenswrapper[4764]: I1001 17:14:30.029059 4764 generic.go:334] "Generic (PLEG): container finished" podID="a450e464-695e-4ca0-b8c4-4989f3c0d9ef" containerID="ec96395c2b2ad4d6f2028a68b8a79e9d48aa2d1e0185059331105a0f128b6043" exitCode=0 Oct 01 17:14:30 crc kubenswrapper[4764]: I1001 17:14:30.029445 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/crc-debug-564j5" event={"ID":"a450e464-695e-4ca0-b8c4-4989f3c0d9ef","Type":"ContainerDied","Data":"ec96395c2b2ad4d6f2028a68b8a79e9d48aa2d1e0185059331105a0f128b6043"} Oct 01 17:14:30 crc kubenswrapper[4764]: I1001 17:14:30.068058 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-564j5"] Oct 01 17:14:30 crc kubenswrapper[4764]: I1001 17:14:30.075190 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5rzcc/crc-debug-564j5"] Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.169878 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.316242 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzdxb\" (UniqueName: \"kubernetes.io/projected/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-kube-api-access-rzdxb\") pod \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.316514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-host\") pod \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\" (UID: \"a450e464-695e-4ca0-b8c4-4989f3c0d9ef\") " Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.316571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-host" (OuterVolumeSpecName: "host") pod "a450e464-695e-4ca0-b8c4-4989f3c0d9ef" (UID: "a450e464-695e-4ca0-b8c4-4989f3c0d9ef"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.316931 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-host\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.322288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-kube-api-access-rzdxb" (OuterVolumeSpecName: "kube-api-access-rzdxb") pod "a450e464-695e-4ca0-b8c4-4989f3c0d9ef" (UID: "a450e464-695e-4ca0-b8c4-4989f3c0d9ef"). InnerVolumeSpecName "kube-api-access-rzdxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.419160 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzdxb\" (UniqueName: \"kubernetes.io/projected/a450e464-695e-4ca0-b8c4-4989f3c0d9ef-kube-api-access-rzdxb\") on node \"crc\" DevicePath \"\"" Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.750768 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a450e464-695e-4ca0-b8c4-4989f3c0d9ef" path="/var/lib/kubelet/pods/a450e464-695e-4ca0-b8c4-4989f3c0d9ef/volumes" Oct 01 17:14:31 crc kubenswrapper[4764]: I1001 17:14:31.970126 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/util/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.048695 4764 scope.go:117] "RemoveContainer" containerID="ec96395c2b2ad4d6f2028a68b8a79e9d48aa2d1e0185059331105a0f128b6043" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.048750 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5rzcc/crc-debug-564j5" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.116665 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/pull/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.145645 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/util/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.168408 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/pull/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.375917 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/extract/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.389391 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/pull/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.416237 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/util/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.594659 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-4724b_af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd/manager/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.625031 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lrzn4_74ebea6a-ca87-4ccb-ab25-3c4899c04d39/kube-rbac-proxy/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.642124 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-4724b_af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd/kube-rbac-proxy/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.844368 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-27n67_0fe3c02d-9e92-4628-8d94-6797d56fe480/kube-rbac-proxy/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.870337 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-27n67_0fe3c02d-9e92-4628-8d94-6797d56fe480/manager/0.log" Oct 01 17:14:32 crc kubenswrapper[4764]: I1001 17:14:32.875196 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lrzn4_74ebea6a-ca87-4ccb-ab25-3c4899c04d39/manager/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.051484 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-vnmvt_0fff9cd4-9690-4a70-a578-0eadbcbb47d6/kube-rbac-proxy/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.184178 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-vnmvt_0fff9cd4-9690-4a70-a578-0eadbcbb47d6/manager/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.241740 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-9vv9s_9973c37b-d58f-48b0-8c1e-707576e2cb09/kube-rbac-proxy/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 
17:14:33.309039 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-9vv9s_9973c37b-d58f-48b0-8c1e-707576e2cb09/manager/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.372890 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-g528r_323e4260-1016-4601-a8c1-f75641230fdb/kube-rbac-proxy/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.483016 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-g528r_323e4260-1016-4601-a8c1-f75641230fdb/manager/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.527570 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-bwj7p_8c17b6bf-0c13-491a-977a-95566d56d7c4/kube-rbac-proxy/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.720072 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-bwj7p_8c17b6bf-0c13-491a-977a-95566d56d7c4/manager/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.743660 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-62xn4_c535869c-c448-4bea-944d-fce55ddd334c/manager/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.806247 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-62xn4_c535869c-c448-4bea-944d-fce55ddd334c/kube-rbac-proxy/0.log" Oct 01 17:14:33 crc kubenswrapper[4764]: I1001 17:14:33.913853 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-fg9hc_67c0305d-d391-4892-869d-f5702a69cc45/kube-rbac-proxy/0.log" Oct 01 
17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.030340 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-fg9hc_67c0305d-d391-4892-869d-f5702a69cc45/manager/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.077520 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b67755477-8xdpz_db117cfb-3e46-4428-93b1-44a66101c57d/kube-rbac-proxy/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.161689 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b67755477-8xdpz_db117cfb-3e46-4428-93b1-44a66101c57d/manager/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.264506 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-lms99_e4b31c01-ec06-434e-af2a-228a1ee7ec19/kube-rbac-proxy/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.306832 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-lms99_e4b31c01-ec06-434e-af2a-228a1ee7ec19/manager/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.484873 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-z9988_53b1bb68-341f-4635-8339-ff10c9b08dee/manager/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.498946 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-z9988_53b1bb68-341f-4635-8339-ff10c9b08dee/kube-rbac-proxy/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.579153 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-59r2r_394ffa3b-3cd6-4deb-a436-624fa75155a2/kube-rbac-proxy/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.961807 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-vnlvl_2050b8cd-91c1-4962-b346-bbfa5c4e652e/kube-rbac-proxy/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.982471 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-vnlvl_2050b8cd-91c1-4962-b346-bbfa5c4e652e/manager/0.log" Oct 01 17:14:34 crc kubenswrapper[4764]: I1001 17:14:34.994668 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-59r2r_394ffa3b-3cd6-4deb-a436-624fa75155a2/manager/0.log" Oct 01 17:14:35 crc kubenswrapper[4764]: I1001 17:14:35.197840 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6_ba9b6db9-115e-4760-aef3-107976da810e/manager/0.log" Oct 01 17:14:35 crc kubenswrapper[4764]: I1001 17:14:35.208417 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6_ba9b6db9-115e-4760-aef3-107976da810e/kube-rbac-proxy/0.log" Oct 01 17:14:35 crc kubenswrapper[4764]: I1001 17:14:35.301839 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c86467d95-tnl8h_0775c44c-131f-4a9c-89d5-bd724765e310/kube-rbac-proxy/0.log" Oct 01 17:14:35 crc kubenswrapper[4764]: I1001 17:14:35.419111 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58f6bc99f-xsx4k_79ca6c6a-0b9f-4122-87ff-4eeb56046125/kube-rbac-proxy/0.log" Oct 01 17:14:35 crc 
kubenswrapper[4764]: I1001 17:14:35.606468 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58f6bc99f-xsx4k_79ca6c6a-0b9f-4122-87ff-4eeb56046125/operator/0.log" Oct 01 17:14:35 crc kubenswrapper[4764]: I1001 17:14:35.682246 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-275hc_17f0747f-d044-4d4f-bdad-937743cfb537/registry-server/0.log" Oct 01 17:14:35 crc kubenswrapper[4764]: I1001 17:14:35.948135 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-rkr2b_33cb2692-6fcf-4af5-bf43-697a4a740c19/kube-rbac-proxy/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.075641 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-jlwbd_d591342f-60e7-48db-9073-b2d6e9fe6992/kube-rbac-proxy/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.092963 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-rkr2b_33cb2692-6fcf-4af5-bf43-697a4a740c19/manager/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.235086 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-jlwbd_d591342f-60e7-48db-9073-b2d6e9fe6992/manager/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.436011 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp_9bb8b56a-c568-4ea4-985e-a80d49b61197/operator/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.496922 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-vbgxb_705820be-248f-49fb-9ace-0f333674985a/kube-rbac-proxy/0.log" 
Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.686094 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-vbgxb_705820be-248f-49fb-9ace-0f333674985a/manager/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.712066 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-8wqsk_10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b/kube-rbac-proxy/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.861883 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-8wqsk_10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b/manager/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.868801 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-q6nwl_e55f7c89-8011-437e-bcbc-b19ae9e25acd/kube-rbac-proxy/0.log" Oct 01 17:14:36 crc kubenswrapper[4764]: I1001 17:14:36.993165 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-q6nwl_e55f7c89-8011-437e-bcbc-b19ae9e25acd/manager/0.log" Oct 01 17:14:37 crc kubenswrapper[4764]: I1001 17:14:37.025715 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c86467d95-tnl8h_0775c44c-131f-4a9c-89d5-bd724765e310/manager/0.log" Oct 01 17:14:37 crc kubenswrapper[4764]: I1001 17:14:37.089363 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f5nmv_29eab2f9-7ef6-4cc6-9f45-af32a4071a5d/kube-rbac-proxy/0.log" Oct 01 17:14:37 crc kubenswrapper[4764]: I1001 17:14:37.137317 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f5nmv_29eab2f9-7ef6-4cc6-9f45-af32a4071a5d/manager/0.log" Oct 01 17:14:51 crc kubenswrapper[4764]: I1001 17:14:51.913565 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:14:51 crc kubenswrapper[4764]: I1001 17:14:51.913946 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:14:51 crc kubenswrapper[4764]: I1001 17:14:51.913984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 17:14:51 crc kubenswrapper[4764]: I1001 17:14:51.914899 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb43c1b5b6bf993c96f4515c75fc9c7c8c81d9ee12ec27aa7fc7cbe9f2ccf27b"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 17:14:51 crc kubenswrapper[4764]: I1001 17:14:51.914960 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://bb43c1b5b6bf993c96f4515c75fc9c7c8c81d9ee12ec27aa7fc7cbe9f2ccf27b" gracePeriod=600 Oct 01 17:14:52 crc kubenswrapper[4764]: I1001 17:14:52.225477 4764 
generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="bb43c1b5b6bf993c96f4515c75fc9c7c8c81d9ee12ec27aa7fc7cbe9f2ccf27b" exitCode=0 Oct 01 17:14:52 crc kubenswrapper[4764]: I1001 17:14:52.225543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"bb43c1b5b6bf993c96f4515c75fc9c7c8c81d9ee12ec27aa7fc7cbe9f2ccf27b"} Oct 01 17:14:52 crc kubenswrapper[4764]: I1001 17:14:52.226206 4764 scope.go:117] "RemoveContainer" containerID="d597636708197dd07dad1986245b8287358d68e8cc9f8d7bd6ea11269f678e99" Oct 01 17:14:53 crc kubenswrapper[4764]: I1001 17:14:53.236235 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22"} Oct 01 17:14:53 crc kubenswrapper[4764]: I1001 17:14:53.316758 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lntzx_eccae5ea-5d95-4d65-97cd-9d8ee4db20bc/control-plane-machine-set-operator/0.log" Oct 01 17:14:53 crc kubenswrapper[4764]: I1001 17:14:53.531630 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4t4_1e81f7ca-2bc8-4d14-a101-e73361300228/kube-rbac-proxy/0.log" Oct 01 17:14:53 crc kubenswrapper[4764]: I1001 17:14:53.549727 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4t4_1e81f7ca-2bc8-4d14-a101-e73361300228/machine-api-operator/0.log" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.172746 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522"] Oct 01 17:15:00 crc kubenswrapper[4764]: E1001 17:15:00.173819 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a450e464-695e-4ca0-b8c4-4989f3c0d9ef" containerName="container-00" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.173834 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a450e464-695e-4ca0-b8c4-4989f3c0d9ef" containerName="container-00" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.174166 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a450e464-695e-4ca0-b8c4-4989f3c0d9ef" containerName="container-00" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.174927 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.178519 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.179196 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.182326 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522"] Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.190180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20984268-a4cf-4552-b597-6882f56c415e-secret-volume\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.190217 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20984268-a4cf-4552-b597-6882f56c415e-config-volume\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.190252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jq7v\" (UniqueName: \"kubernetes.io/projected/20984268-a4cf-4552-b597-6882f56c415e-kube-api-access-9jq7v\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.291015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20984268-a4cf-4552-b597-6882f56c415e-secret-volume\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.291096 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20984268-a4cf-4552-b597-6882f56c415e-config-volume\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.291143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jq7v\" (UniqueName: \"kubernetes.io/projected/20984268-a4cf-4552-b597-6882f56c415e-kube-api-access-9jq7v\") pod \"collect-profiles-29322315-hd522\" (UID: 
\"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.292405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20984268-a4cf-4552-b597-6882f56c415e-config-volume\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.296686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20984268-a4cf-4552-b597-6882f56c415e-secret-volume\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.307556 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jq7v\" (UniqueName: \"kubernetes.io/projected/20984268-a4cf-4552-b597-6882f56c415e-kube-api-access-9jq7v\") pod \"collect-profiles-29322315-hd522\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.498515 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:00 crc kubenswrapper[4764]: I1001 17:15:00.930816 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522"] Oct 01 17:15:01 crc kubenswrapper[4764]: I1001 17:15:01.306734 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" event={"ID":"20984268-a4cf-4552-b597-6882f56c415e","Type":"ContainerStarted","Data":"27346181f8722899108f383adf3c22679bfff2f99031a676c7c56d1b002f7d6a"} Oct 01 17:15:01 crc kubenswrapper[4764]: I1001 17:15:01.307003 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" event={"ID":"20984268-a4cf-4552-b597-6882f56c415e","Type":"ContainerStarted","Data":"8867f0b6854f106aee47d2fbd8a337163e0e248c8346cee9eac7b7f3292bdb1d"} Oct 01 17:15:01 crc kubenswrapper[4764]: I1001 17:15:01.328745 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" podStartSLOduration=1.328728425 podStartE2EDuration="1.328728425s" podCreationTimestamp="2025-10-01 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 17:15:01.319031218 +0000 UTC m=+4364.318678053" watchObservedRunningTime="2025-10-01 17:15:01.328728425 +0000 UTC m=+4364.328375260" Oct 01 17:15:02 crc kubenswrapper[4764]: I1001 17:15:02.324563 4764 generic.go:334] "Generic (PLEG): container finished" podID="20984268-a4cf-4552-b597-6882f56c415e" containerID="27346181f8722899108f383adf3c22679bfff2f99031a676c7c56d1b002f7d6a" exitCode=0 Oct 01 17:15:02 crc kubenswrapper[4764]: I1001 17:15:02.324872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" event={"ID":"20984268-a4cf-4552-b597-6882f56c415e","Type":"ContainerDied","Data":"27346181f8722899108f383adf3c22679bfff2f99031a676c7c56d1b002f7d6a"} Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.722871 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.859566 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20984268-a4cf-4552-b597-6882f56c415e-secret-volume\") pod \"20984268-a4cf-4552-b597-6882f56c415e\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.859652 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jq7v\" (UniqueName: \"kubernetes.io/projected/20984268-a4cf-4552-b597-6882f56c415e-kube-api-access-9jq7v\") pod \"20984268-a4cf-4552-b597-6882f56c415e\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.859775 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20984268-a4cf-4552-b597-6882f56c415e-config-volume\") pod \"20984268-a4cf-4552-b597-6882f56c415e\" (UID: \"20984268-a4cf-4552-b597-6882f56c415e\") " Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.860858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20984268-a4cf-4552-b597-6882f56c415e-config-volume" (OuterVolumeSpecName: "config-volume") pod "20984268-a4cf-4552-b597-6882f56c415e" (UID: "20984268-a4cf-4552-b597-6882f56c415e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.861459 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20984268-a4cf-4552-b597-6882f56c415e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.867337 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20984268-a4cf-4552-b597-6882f56c415e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20984268-a4cf-4552-b597-6882f56c415e" (UID: "20984268-a4cf-4552-b597-6882f56c415e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.870432 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20984268-a4cf-4552-b597-6882f56c415e-kube-api-access-9jq7v" (OuterVolumeSpecName: "kube-api-access-9jq7v") pod "20984268-a4cf-4552-b597-6882f56c415e" (UID: "20984268-a4cf-4552-b597-6882f56c415e"). InnerVolumeSpecName "kube-api-access-9jq7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.963459 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jq7v\" (UniqueName: \"kubernetes.io/projected/20984268-a4cf-4552-b597-6882f56c415e-kube-api-access-9jq7v\") on node \"crc\" DevicePath \"\"" Oct 01 17:15:03 crc kubenswrapper[4764]: I1001 17:15:03.963494 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20984268-a4cf-4552-b597-6882f56c415e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 17:15:04 crc kubenswrapper[4764]: I1001 17:15:04.354835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" event={"ID":"20984268-a4cf-4552-b597-6882f56c415e","Type":"ContainerDied","Data":"8867f0b6854f106aee47d2fbd8a337163e0e248c8346cee9eac7b7f3292bdb1d"} Oct 01 17:15:04 crc kubenswrapper[4764]: I1001 17:15:04.355139 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8867f0b6854f106aee47d2fbd8a337163e0e248c8346cee9eac7b7f3292bdb1d" Oct 01 17:15:04 crc kubenswrapper[4764]: I1001 17:15:04.355204 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322315-hd522" Oct 01 17:15:04 crc kubenswrapper[4764]: I1001 17:15:04.399003 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496"] Oct 01 17:15:04 crc kubenswrapper[4764]: I1001 17:15:04.410108 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322270-rh496"] Oct 01 17:15:05 crc kubenswrapper[4764]: I1001 17:15:05.736000 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ee5fea-bbee-4155-a586-fbc9af7e8608" path="/var/lib/kubelet/pods/63ee5fea-bbee-4155-a586-fbc9af7e8608/volumes" Oct 01 17:15:06 crc kubenswrapper[4764]: I1001 17:15:06.989243 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xl9g4_b22968b8-7419-48e2-9fab-a54611dcecad/cert-manager-controller/0.log" Oct 01 17:15:07 crc kubenswrapper[4764]: I1001 17:15:07.036767 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q78nt_0e72f60f-a975-4370-877f-5d5ba3c7c0b3/cert-manager-cainjector/0.log" Oct 01 17:15:07 crc kubenswrapper[4764]: I1001 17:15:07.233426 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pxbtx_9eb7dd6e-2f03-4db5-9564-d87513d69d6b/cert-manager-webhook/0.log" Oct 01 17:15:19 crc kubenswrapper[4764]: I1001 17:15:19.186463 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-snhkz_b82401c8-962a-451a-954e-2f603fe91129/nmstate-handler/0.log" Oct 01 17:15:19 crc kubenswrapper[4764]: I1001 17:15:19.224748 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-t66ns_5b55cc57-324e-4f77-aa9f-d655abd399b4/nmstate-console-plugin/0.log" Oct 01 17:15:19 crc kubenswrapper[4764]: I1001 
17:15:19.380301 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dg2fb_e8f3e872-b4e9-4b58-95aa-f63812824933/kube-rbac-proxy/0.log" Oct 01 17:15:19 crc kubenswrapper[4764]: I1001 17:15:19.381386 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dg2fb_e8f3e872-b4e9-4b58-95aa-f63812824933/nmstate-metrics/0.log" Oct 01 17:15:19 crc kubenswrapper[4764]: I1001 17:15:19.566240 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-lfzc4_dd999924-e56c-45fa-8214-cec275174611/nmstate-operator/0.log" Oct 01 17:15:19 crc kubenswrapper[4764]: I1001 17:15:19.606141 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bmnmm_2f2de87a-678a-4de3-8fec-9b695f301201/nmstate-webhook/0.log" Oct 01 17:15:32 crc kubenswrapper[4764]: I1001 17:15:32.669793 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-mxkb8_1e96a209-b683-4060-8c9f-b7b1ae8c89b0/kube-rbac-proxy/0.log" Oct 01 17:15:32 crc kubenswrapper[4764]: I1001 17:15:32.764501 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-mxkb8_1e96a209-b683-4060-8c9f-b7b1ae8c89b0/controller/0.log" Oct 01 17:15:32 crc kubenswrapper[4764]: I1001 17:15:32.882323 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.057258 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.065547 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 
01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.095750 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.153169 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.247680 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.281449 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.306651 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.315067 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.503412 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.532103 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.546994 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.562704 4764 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/controller/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.746725 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/kube-rbac-proxy/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.752820 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/frr-metrics/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.760860 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/kube-rbac-proxy-frr/0.log" Oct 01 17:15:33 crc kubenswrapper[4764]: I1001 17:15:33.960068 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/reloader/0.log" Oct 01 17:15:34 crc kubenswrapper[4764]: I1001 17:15:34.015116 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-bhdff_247af8f1-5e4b-4e17-9d31-055bdce2a1d6/frr-k8s-webhook-server/0.log" Oct 01 17:15:34 crc kubenswrapper[4764]: I1001 17:15:34.202447 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-845477fbc7-z8qk5_3b485840-2253-4eb1-888b-5e16d76a3a3d/manager/0.log" Oct 01 17:15:34 crc kubenswrapper[4764]: I1001 17:15:34.236440 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/frr/0.log" Oct 01 17:15:34 crc kubenswrapper[4764]: I1001 17:15:34.357922 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-755d79df97-5b9ck_54e1ea89-8cd7-4a1c-a1b9-e20f321198f7/webhook-server/0.log" Oct 01 17:15:34 crc kubenswrapper[4764]: I1001 17:15:34.471656 4764 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zm88x_0bf73b76-19c9-4264-abe9-80d24dcf6ee6/kube-rbac-proxy/0.log" Oct 01 17:15:34 crc kubenswrapper[4764]: I1001 17:15:34.766748 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zm88x_0bf73b76-19c9-4264-abe9-80d24dcf6ee6/speaker/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.256385 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/util/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.449873 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/util/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.465443 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/pull/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.465656 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/pull/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.620354 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/pull/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.622864 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/util/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.638038 4764 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/extract/0.log" Oct 01 17:15:47 crc kubenswrapper[4764]: I1001 17:15:47.792911 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/util/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.492217 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/util/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.503087 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/pull/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.538762 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/pull/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.698775 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/util/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.709956 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/pull/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.751462 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/extract/0.log" Oct 01 17:15:48 crc kubenswrapper[4764]: I1001 17:15:48.918654 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-utilities/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.038729 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-utilities/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.076433 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-content/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.087253 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-content/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.204674 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-utilities/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.224759 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-content/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.435916 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-utilities/0.log" Oct 01 17:15:49 crc kubenswrapper[4764]: I1001 17:15:49.759953 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/registry-server/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.145103 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-utilities/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.152407 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-content/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.176667 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-content/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.363992 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-utilities/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.425624 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-content/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.574648 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/util/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.791488 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/util/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.802110 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/pull/0.log" Oct 01 17:15:50 crc kubenswrapper[4764]: I1001 17:15:50.808853 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/pull/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.053147 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/util/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.053874 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/extract/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.054226 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/pull/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.260737 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/util/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.311184 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/registry-server/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.426376 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/util/0.log" Oct 01 17:15:51 crc 
kubenswrapper[4764]: I1001 17:15:51.460539 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/pull/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.468682 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/pull/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.644950 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/extract/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.664372 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xdv29_176eab78-eb2e-4612-994f-d13d95e6c80d/marketplace-operator/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.670574 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/util/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.674234 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/pull/0.log" Oct 01 17:15:51 crc kubenswrapper[4764]: I1001 17:15:51.857392 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-utilities/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.013701 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-utilities/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.029246 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-content/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.037071 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-content/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.194597 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-utilities/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.214920 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-content/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.260014 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/extract-utilities/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.389960 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/registry-server/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.428480 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/extract-content/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.475290 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/extract-utilities/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.513342 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/extract-content/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.668545 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/extract-content/0.log" Oct 01 17:15:52 crc kubenswrapper[4764]: I1001 17:15:52.668640 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/extract-utilities/0.log" Oct 01 17:15:53 crc kubenswrapper[4764]: I1001 17:15:53.067262 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6b8k2_a1952fe9-1eeb-48df-a902-99ca6708f92d/registry-server/0.log" Oct 01 17:15:55 crc kubenswrapper[4764]: I1001 17:15:55.618656 4764 scope.go:117] "RemoveContainer" containerID="c5fb19bc9012f9a463edd56225fc928d105aed2a16ef93d28d0f2024593ad666" Oct 01 17:17:21 crc kubenswrapper[4764]: I1001 17:17:21.914219 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:17:21 crc kubenswrapper[4764]: I1001 17:17:21.914890 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:17:51 
crc kubenswrapper[4764]: I1001 17:17:51.914408 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:17:51 crc kubenswrapper[4764]: I1001 17:17:51.914971 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:17:55 crc kubenswrapper[4764]: I1001 17:17:55.695998 4764 scope.go:117] "RemoveContainer" containerID="d57a866f94954bb0b49d80cbc108977f4ea9ed4b988c1a13f4834bf3e833f353" Oct 01 17:18:09 crc kubenswrapper[4764]: I1001 17:18:09.042818 4764 generic.go:334] "Generic (PLEG): container finished" podID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerID="490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250" exitCode=0 Oct 01 17:18:09 crc kubenswrapper[4764]: I1001 17:18:09.042893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" event={"ID":"c01ddf33-500f-470a-a3e0-43ce226d3d44","Type":"ContainerDied","Data":"490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250"} Oct 01 17:18:09 crc kubenswrapper[4764]: I1001 17:18:09.044110 4764 scope.go:117] "RemoveContainer" containerID="490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250" Oct 01 17:18:09 crc kubenswrapper[4764]: I1001 17:18:09.466402 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5rzcc_must-gather-mjwf9_c01ddf33-500f-470a-a3e0-43ce226d3d44/gather/0.log" Oct 01 17:18:17 crc kubenswrapper[4764]: I1001 17:18:17.470147 4764 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-must-gather-5rzcc/must-gather-mjwf9"] Oct 01 17:18:17 crc kubenswrapper[4764]: I1001 17:18:17.470986 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="copy" containerID="cri-o://be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e" gracePeriod=2 Oct 01 17:18:17 crc kubenswrapper[4764]: I1001 17:18:17.480574 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5rzcc/must-gather-mjwf9"] Oct 01 17:18:17 crc kubenswrapper[4764]: I1001 17:18:17.959935 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5rzcc_must-gather-mjwf9_c01ddf33-500f-470a-a3e0-43ce226d3d44/copy/0.log" Oct 01 17:18:17 crc kubenswrapper[4764]: I1001 17:18:17.961396 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.033191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kll8d\" (UniqueName: \"kubernetes.io/projected/c01ddf33-500f-470a-a3e0-43ce226d3d44-kube-api-access-kll8d\") pod \"c01ddf33-500f-470a-a3e0-43ce226d3d44\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.033301 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c01ddf33-500f-470a-a3e0-43ce226d3d44-must-gather-output\") pod \"c01ddf33-500f-470a-a3e0-43ce226d3d44\" (UID: \"c01ddf33-500f-470a-a3e0-43ce226d3d44\") " Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.042281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c01ddf33-500f-470a-a3e0-43ce226d3d44-kube-api-access-kll8d" (OuterVolumeSpecName: 
"kube-api-access-kll8d") pod "c01ddf33-500f-470a-a3e0-43ce226d3d44" (UID: "c01ddf33-500f-470a-a3e0-43ce226d3d44"). InnerVolumeSpecName "kube-api-access-kll8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.137532 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kll8d\" (UniqueName: \"kubernetes.io/projected/c01ddf33-500f-470a-a3e0-43ce226d3d44-kube-api-access-kll8d\") on node \"crc\" DevicePath \"\"" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.153182 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5rzcc_must-gather-mjwf9_c01ddf33-500f-470a-a3e0-43ce226d3d44/copy/0.log" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.153696 4764 generic.go:334] "Generic (PLEG): container finished" podID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerID="be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e" exitCode=143 Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.153763 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5rzcc/must-gather-mjwf9" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.153793 4764 scope.go:117] "RemoveContainer" containerID="be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.187073 4764 scope.go:117] "RemoveContainer" containerID="490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.190219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c01ddf33-500f-470a-a3e0-43ce226d3d44-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c01ddf33-500f-470a-a3e0-43ce226d3d44" (UID: "c01ddf33-500f-470a-a3e0-43ce226d3d44"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.240183 4764 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c01ddf33-500f-470a-a3e0-43ce226d3d44-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.246931 4764 scope.go:117] "RemoveContainer" containerID="be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e" Oct 01 17:18:18 crc kubenswrapper[4764]: E1001 17:18:18.247413 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e\": container with ID starting with be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e not found: ID does not exist" containerID="be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.247449 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e"} err="failed to get container status \"be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e\": rpc error: code = NotFound desc = could not find container \"be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e\": container with ID starting with be4434c6457726053d9801f5672d1f2546429c98701974317484c1f9e540b34e not found: ID does not exist" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.247469 4764 scope.go:117] "RemoveContainer" containerID="490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250" Oct 01 17:18:18 crc kubenswrapper[4764]: E1001 17:18:18.247862 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250\": container with ID starting with 490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250 not found: ID does not exist" containerID="490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250" Oct 01 17:18:18 crc kubenswrapper[4764]: I1001 17:18:18.247902 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250"} err="failed to get container status \"490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250\": rpc error: code = NotFound desc = could not find container \"490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250\": container with ID starting with 490cf79ad54ef018ff21a71ef185500ad7c8a0396ad699e38c33616a69eba250 not found: ID does not exist" Oct 01 17:18:19 crc kubenswrapper[4764]: I1001 17:18:19.735401 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" path="/var/lib/kubelet/pods/c01ddf33-500f-470a-a3e0-43ce226d3d44/volumes" Oct 01 17:18:21 crc kubenswrapper[4764]: I1001 17:18:21.913784 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 17:18:21 crc kubenswrapper[4764]: I1001 17:18:21.914160 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 17:18:21 crc kubenswrapper[4764]: I1001 17:18:21.914204 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" Oct 01 17:18:22 crc kubenswrapper[4764]: I1001 17:18:22.191392 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22"} pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 17:18:22 crc kubenswrapper[4764]: I1001 17:18:22.191459 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" containerID="cri-o://99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" gracePeriod=600 Oct 01 17:18:22 crc kubenswrapper[4764]: E1001 17:18:22.312688 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:18:23 crc kubenswrapper[4764]: I1001 17:18:23.200083 4764 generic.go:334] "Generic (PLEG): container finished" podID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" exitCode=0 Oct 01 17:18:23 crc kubenswrapper[4764]: I1001 17:18:23.200121 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerDied","Data":"99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22"} 
Oct 01 17:18:23 crc kubenswrapper[4764]: I1001 17:18:23.200152 4764 scope.go:117] "RemoveContainer" containerID="bb43c1b5b6bf993c96f4515c75fc9c7c8c81d9ee12ec27aa7fc7cbe9f2ccf27b" Oct 01 17:18:23 crc kubenswrapper[4764]: I1001 17:18:23.200764 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:18:23 crc kubenswrapper[4764]: E1001 17:18:23.201100 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:18:37 crc kubenswrapper[4764]: I1001 17:18:37.731191 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:18:37 crc kubenswrapper[4764]: E1001 17:18:37.733015 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.256196 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qkzln"] Oct 01 17:18:45 crc kubenswrapper[4764]: E1001 17:18:45.257213 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="copy" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.257227 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="copy" Oct 01 17:18:45 crc kubenswrapper[4764]: E1001 17:18:45.257253 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20984268-a4cf-4552-b597-6882f56c415e" containerName="collect-profiles" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.257259 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20984268-a4cf-4552-b597-6882f56c415e" containerName="collect-profiles" Oct 01 17:18:45 crc kubenswrapper[4764]: E1001 17:18:45.257274 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="gather" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.257280 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="gather" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.257463 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="copy" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.257480 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c01ddf33-500f-470a-a3e0-43ce226d3d44" containerName="gather" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.257502 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="20984268-a4cf-4552-b597-6882f56c415e" containerName="collect-profiles" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.258825 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.277559 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qkzln"] Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.378912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rcf\" (UniqueName: \"kubernetes.io/projected/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-kube-api-access-d6rcf\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.378963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-catalog-content\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.379316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-utilities\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.481046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-utilities\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.481217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d6rcf\" (UniqueName: \"kubernetes.io/projected/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-kube-api-access-d6rcf\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.481256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-catalog-content\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.481539 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-utilities\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.481593 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-catalog-content\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.575336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rcf\" (UniqueName: \"kubernetes.io/projected/49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f-kube-api-access-d6rcf\") pod \"redhat-operators-qkzln\" (UID: \"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f\") " pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:45 crc kubenswrapper[4764]: I1001 17:18:45.589605 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:46 crc kubenswrapper[4764]: I1001 17:18:46.061527 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qkzln"] Oct 01 17:18:46 crc kubenswrapper[4764]: I1001 17:18:46.411865 4764 generic.go:334] "Generic (PLEG): container finished" podID="49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f" containerID="67fc5b1e6e67defda4b227e7c2eae06bbe48fc2528f94edde6e06185946e94b3" exitCode=0 Oct 01 17:18:46 crc kubenswrapper[4764]: I1001 17:18:46.411968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkzln" event={"ID":"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f","Type":"ContainerDied","Data":"67fc5b1e6e67defda4b227e7c2eae06bbe48fc2528f94edde6e06185946e94b3"} Oct 01 17:18:46 crc kubenswrapper[4764]: I1001 17:18:46.412141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkzln" event={"ID":"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f","Type":"ContainerStarted","Data":"31ab0cc055d2c9c0e57a5bed460d73b2fe41a05db64e0ce8ab86aa7617d154a1"} Oct 01 17:18:46 crc kubenswrapper[4764]: I1001 17:18:46.413911 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 17:18:48 crc kubenswrapper[4764]: I1001 17:18:48.722670 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:18:48 crc kubenswrapper[4764]: E1001 17:18:48.723359 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:18:53 crc 
kubenswrapper[4764]: I1001 17:18:53.476934 4764 generic.go:334] "Generic (PLEG): container finished" podID="49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f" containerID="da64784a69721c0eb5426203f19ed84ca969230d9c0df0ec0b2f4cf314e34e0a" exitCode=0 Oct 01 17:18:53 crc kubenswrapper[4764]: I1001 17:18:53.477159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkzln" event={"ID":"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f","Type":"ContainerDied","Data":"da64784a69721c0eb5426203f19ed84ca969230d9c0df0ec0b2f4cf314e34e0a"} Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.489954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qkzln" event={"ID":"49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f","Type":"ContainerStarted","Data":"dd2e2dc8b952d5e52f42ead45231fc4e45e26a7375e98fad07b1bd30a8b006d7"} Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.511985 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qkzln" podStartSLOduration=1.8741915059999998 podStartE2EDuration="9.511965107s" podCreationTimestamp="2025-10-01 17:18:45 +0000 UTC" firstStartedPulling="2025-10-01 17:18:46.413612917 +0000 UTC m=+4589.413259752" lastFinishedPulling="2025-10-01 17:18:54.051386508 +0000 UTC m=+4597.051033353" observedRunningTime="2025-10-01 17:18:54.511171288 +0000 UTC m=+4597.510818123" watchObservedRunningTime="2025-10-01 17:18:54.511965107 +0000 UTC m=+4597.511611942" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.585128 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v59zf/must-gather-5vzpr"] Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.587229 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.594134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v59zf/must-gather-5vzpr"] Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.594929 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v59zf"/"kube-root-ca.crt" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.595158 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v59zf"/"openshift-service-ca.crt" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.676431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d047901f-1b7a-4204-8764-67631f06b45d-must-gather-output\") pod \"must-gather-5vzpr\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") " pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.676503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b67hp\" (UniqueName: \"kubernetes.io/projected/d047901f-1b7a-4204-8764-67631f06b45d-kube-api-access-b67hp\") pod \"must-gather-5vzpr\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") " pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.778934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d047901f-1b7a-4204-8764-67631f06b45d-must-gather-output\") pod \"must-gather-5vzpr\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") " pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.779302 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b67hp\" (UniqueName: \"kubernetes.io/projected/d047901f-1b7a-4204-8764-67631f06b45d-kube-api-access-b67hp\") pod \"must-gather-5vzpr\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") " pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.779724 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d047901f-1b7a-4204-8764-67631f06b45d-must-gather-output\") pod \"must-gather-5vzpr\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") " pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.806668 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b67hp\" (UniqueName: \"kubernetes.io/projected/d047901f-1b7a-4204-8764-67631f06b45d-kube-api-access-b67hp\") pod \"must-gather-5vzpr\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") " pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:54 crc kubenswrapper[4764]: I1001 17:18:54.910018 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/must-gather-5vzpr" Oct 01 17:18:55 crc kubenswrapper[4764]: I1001 17:18:55.477941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v59zf/must-gather-5vzpr"] Oct 01 17:18:55 crc kubenswrapper[4764]: I1001 17:18:55.501946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/must-gather-5vzpr" event={"ID":"d047901f-1b7a-4204-8764-67631f06b45d","Type":"ContainerStarted","Data":"0c09a94b5fefe57d3e76536a3e96b595c884b8902de149efb66808e43a6fcf19"} Oct 01 17:18:55 crc kubenswrapper[4764]: I1001 17:18:55.591273 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:55 crc kubenswrapper[4764]: I1001 17:18:55.591634 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:18:56 crc kubenswrapper[4764]: I1001 17:18:56.518120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/must-gather-5vzpr" event={"ID":"d047901f-1b7a-4204-8764-67631f06b45d","Type":"ContainerStarted","Data":"37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8"} Oct 01 17:18:56 crc kubenswrapper[4764]: I1001 17:18:56.518510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/must-gather-5vzpr" event={"ID":"d047901f-1b7a-4204-8764-67631f06b45d","Type":"ContainerStarted","Data":"df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"} Oct 01 17:18:56 crc kubenswrapper[4764]: I1001 17:18:56.537394 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v59zf/must-gather-5vzpr" podStartSLOduration=2.537365546 podStartE2EDuration="2.537365546s" podCreationTimestamp="2025-10-01 17:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 17:18:56.533202303 +0000 UTC m=+4599.532849138" watchObservedRunningTime="2025-10-01 17:18:56.537365546 +0000 UTC m=+4599.537012421" Oct 01 17:18:56 crc kubenswrapper[4764]: I1001 17:18:56.654319 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qkzln" podUID="49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f" containerName="registry-server" probeResult="failure" output=< Oct 01 17:18:56 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Oct 01 17:18:56 crc kubenswrapper[4764]: > Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.572663 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v59zf/crc-debug-lskbd"] Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.574089 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.579034 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v59zf"/"default-dockercfg-wr99z" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.719037 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/972d86d6-ca6f-44b8-b75d-d8a453e11843-host\") pod \"crc-debug-lskbd\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.719182 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8c72\" (UniqueName: \"kubernetes.io/projected/972d86d6-ca6f-44b8-b75d-d8a453e11843-kube-api-access-w8c72\") pod \"crc-debug-lskbd\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.821343 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8c72\" (UniqueName: \"kubernetes.io/projected/972d86d6-ca6f-44b8-b75d-d8a453e11843-kube-api-access-w8c72\") pod \"crc-debug-lskbd\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.821695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/972d86d6-ca6f-44b8-b75d-d8a453e11843-host\") pod \"crc-debug-lskbd\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.822924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/972d86d6-ca6f-44b8-b75d-d8a453e11843-host\") pod \"crc-debug-lskbd\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.850266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8c72\" (UniqueName: \"kubernetes.io/projected/972d86d6-ca6f-44b8-b75d-d8a453e11843-kube-api-access-w8c72\") pod \"crc-debug-lskbd\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: I1001 17:19:00.890807 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:19:00 crc kubenswrapper[4764]: W1001 17:19:00.921143 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod972d86d6_ca6f_44b8_b75d_d8a453e11843.slice/crio-e22df57bfed32007e4759b2a9344d62732b70f69865a9823c668aa9683abfdae WatchSource:0}: Error finding container e22df57bfed32007e4759b2a9344d62732b70f69865a9823c668aa9683abfdae: Status 404 returned error can't find the container with id e22df57bfed32007e4759b2a9344d62732b70f69865a9823c668aa9683abfdae Oct 01 17:19:01 crc kubenswrapper[4764]: I1001 17:19:01.568979 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-lskbd" event={"ID":"972d86d6-ca6f-44b8-b75d-d8a453e11843","Type":"ContainerStarted","Data":"c2ab0a8d8731c1ede0182636a5ea166c683951461fe084b8729aab8b99d6c98e"} Oct 01 17:19:01 crc kubenswrapper[4764]: I1001 17:19:01.569415 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-lskbd" event={"ID":"972d86d6-ca6f-44b8-b75d-d8a453e11843","Type":"ContainerStarted","Data":"e22df57bfed32007e4759b2a9344d62732b70f69865a9823c668aa9683abfdae"} Oct 01 17:19:01 crc kubenswrapper[4764]: I1001 17:19:01.584447 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v59zf/crc-debug-lskbd" podStartSLOduration=1.58443017 podStartE2EDuration="1.58443017s" podCreationTimestamp="2025-10-01 17:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 17:19:01.581349054 +0000 UTC m=+4604.580995889" watchObservedRunningTime="2025-10-01 17:19:01.58443017 +0000 UTC m=+4604.584077005" Oct 01 17:19:02 crc kubenswrapper[4764]: I1001 17:19:02.722210 4764 scope.go:117] "RemoveContainer" 
containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:19:02 crc kubenswrapper[4764]: E1001 17:19:02.722764 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:19:05 crc kubenswrapper[4764]: I1001 17:19:05.643249 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:19:05 crc kubenswrapper[4764]: I1001 17:19:05.690996 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qkzln" Oct 01 17:19:05 crc kubenswrapper[4764]: I1001 17:19:05.754810 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qkzln"] Oct 01 17:19:05 crc kubenswrapper[4764]: I1001 17:19:05.878725 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 17:19:05 crc kubenswrapper[4764]: I1001 17:19:05.879015 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6b8k2" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="registry-server" containerID="cri-o://36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1" gracePeriod=2 Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.423599 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.549410 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-catalog-content\") pod \"a1952fe9-1eeb-48df-a902-99ca6708f92d\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.549970 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7jqv\" (UniqueName: \"kubernetes.io/projected/a1952fe9-1eeb-48df-a902-99ca6708f92d-kube-api-access-s7jqv\") pod \"a1952fe9-1eeb-48df-a902-99ca6708f92d\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.550085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-utilities\") pod \"a1952fe9-1eeb-48df-a902-99ca6708f92d\" (UID: \"a1952fe9-1eeb-48df-a902-99ca6708f92d\") " Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.554384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-utilities" (OuterVolumeSpecName: "utilities") pod "a1952fe9-1eeb-48df-a902-99ca6708f92d" (UID: "a1952fe9-1eeb-48df-a902-99ca6708f92d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.581298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1952fe9-1eeb-48df-a902-99ca6708f92d-kube-api-access-s7jqv" (OuterVolumeSpecName: "kube-api-access-s7jqv") pod "a1952fe9-1eeb-48df-a902-99ca6708f92d" (UID: "a1952fe9-1eeb-48df-a902-99ca6708f92d"). InnerVolumeSpecName "kube-api-access-s7jqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.617236 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerID="36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1" exitCode=0 Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.620090 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6b8k2" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.620017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerDied","Data":"36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1"} Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.620240 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6b8k2" event={"ID":"a1952fe9-1eeb-48df-a902-99ca6708f92d","Type":"ContainerDied","Data":"5326af3d6ae5b2bc22be4eedc6a76cb09a59ee7836c7c9d398320afc7f82970e"} Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.620286 4764 scope.go:117] "RemoveContainer" containerID="36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.631857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1952fe9-1eeb-48df-a902-99ca6708f92d" (UID: "a1952fe9-1eeb-48df-a902-99ca6708f92d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.652105 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7jqv\" (UniqueName: \"kubernetes.io/projected/a1952fe9-1eeb-48df-a902-99ca6708f92d-kube-api-access-s7jqv\") on node \"crc\" DevicePath \"\"" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.652155 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.652166 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1952fe9-1eeb-48df-a902-99ca6708f92d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.654081 4764 scope.go:117] "RemoveContainer" containerID="02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f" Oct 01 17:19:06 crc kubenswrapper[4764]: I1001 17:19:06.683523 4764 scope.go:117] "RemoveContainer" containerID="29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.438166 4764 scope.go:117] "RemoveContainer" containerID="36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1" Oct 01 17:19:07 crc kubenswrapper[4764]: E1001 17:19:07.440713 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1\": container with ID starting with 36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1 not found: ID does not exist" containerID="36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.440768 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1"} err="failed to get container status \"36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1\": rpc error: code = NotFound desc = could not find container \"36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1\": container with ID starting with 36cb50f081a79e377af3dac542b26708fffff7cf3a1354f7b3036c465c6d70c1 not found: ID does not exist" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.440812 4764 scope.go:117] "RemoveContainer" containerID="02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f" Oct 01 17:19:07 crc kubenswrapper[4764]: E1001 17:19:07.441190 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f\": container with ID starting with 02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f not found: ID does not exist" containerID="02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.441245 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f"} err="failed to get container status \"02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f\": rpc error: code = NotFound desc = could not find container \"02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f\": container with ID starting with 02cc9ccefb556142af060573f9e03c1cc31a3164581e38cf10ea9899855ca93f not found: ID does not exist" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.441260 4764 scope.go:117] "RemoveContainer" containerID="29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4" Oct 01 17:19:07 crc kubenswrapper[4764]: E1001 17:19:07.441547 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4\": container with ID starting with 29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4 not found: ID does not exist" containerID="29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.441575 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4"} err="failed to get container status \"29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4\": rpc error: code = NotFound desc = could not find container \"29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4\": container with ID starting with 29473e094b389c2c5896304628e8f8f55bc26a9ac5d566e7478fb35dc2054ca4 not found: ID does not exist" Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.511237 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.528329 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6b8k2"] Oct 01 17:19:07 crc kubenswrapper[4764]: I1001 17:19:07.742506 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" path="/var/lib/kubelet/pods/a1952fe9-1eeb-48df-a902-99ca6708f92d/volumes" Oct 01 17:19:15 crc kubenswrapper[4764]: I1001 17:19:15.723026 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:19:15 crc kubenswrapper[4764]: E1001 17:19:15.723848 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:19:28 crc kubenswrapper[4764]: I1001 17:19:28.721911 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:19:28 crc kubenswrapper[4764]: E1001 17:19:28.722931 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:19:40 crc kubenswrapper[4764]: I1001 17:19:40.721499 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:19:40 crc kubenswrapper[4764]: E1001 17:19:40.723518 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:19:51 crc kubenswrapper[4764]: I1001 17:19:51.722668 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:19:51 crc kubenswrapper[4764]: E1001 17:19:51.724621 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:20:02 crc kubenswrapper[4764]: I1001 17:20:02.722535 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:20:02 crc kubenswrapper[4764]: E1001 17:20:02.723317 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:20:07 crc kubenswrapper[4764]: I1001 17:20:07.194424 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6bf96b65c4-djgxs_adede25f-2ef2-4d24-a18b-93865063b49f/barbican-api/0.log" Oct 01 17:20:07 crc kubenswrapper[4764]: I1001 17:20:07.370610 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6bf96b65c4-djgxs_adede25f-2ef2-4d24-a18b-93865063b49f/barbican-api-log/0.log" Oct 01 17:20:07 crc kubenswrapper[4764]: I1001 17:20:07.550420 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-557bf5c9c4-gn9s8_446019eb-78e7-4c76-983b-44a968141080/barbican-keystone-listener/0.log" Oct 01 17:20:07 crc kubenswrapper[4764]: I1001 17:20:07.950922 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-557bf5c9c4-gn9s8_446019eb-78e7-4c76-983b-44a968141080/barbican-keystone-listener-log/0.log" Oct 01 17:20:07 crc kubenswrapper[4764]: I1001 17:20:07.958446 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-56949f8bfc-chjrr_f4930a41-989a-4747-b659-f35df5f73bd0/barbican-worker/0.log" Oct 01 17:20:08 crc kubenswrapper[4764]: I1001 17:20:08.168606 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56949f8bfc-chjrr_f4930a41-989a-4747-b659-f35df5f73bd0/barbican-worker-log/0.log" Oct 01 17:20:08 crc kubenswrapper[4764]: I1001 17:20:08.363692 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fl62m_abe7d369-08b8-431b-9b66-3b6056a37e00/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:08 crc kubenswrapper[4764]: I1001 17:20:08.565935 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/ceilometer-central-agent/0.log" Oct 01 17:20:08 crc kubenswrapper[4764]: I1001 17:20:08.656561 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/ceilometer-notification-agent/0.log" Oct 01 17:20:08 crc kubenswrapper[4764]: I1001 17:20:08.731497 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/proxy-httpd/0.log" Oct 01 17:20:08 crc kubenswrapper[4764]: I1001 17:20:08.852966 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_51273dda-10be-4519-adeb-992fa5936387/sg-core/0.log" Oct 01 17:20:09 crc kubenswrapper[4764]: I1001 17:20:09.041573 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-gxtkq_c539a876-f4e2-41db-aa15-6a54e4ac75c6/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:09 crc kubenswrapper[4764]: I1001 17:20:09.277299 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-gwcvl_c353fd70-5d43-4e79-9863-9d1c4156df15/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:09 crc kubenswrapper[4764]: I1001 17:20:09.517853 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7934a61c-2af3-4c51-987b-411ee1c7645f/cinder-api/0.log" Oct 01 17:20:09 crc kubenswrapper[4764]: I1001 17:20:09.598857 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7934a61c-2af3-4c51-987b-411ee1c7645f/cinder-api-log/0.log" Oct 01 17:20:09 crc kubenswrapper[4764]: I1001 17:20:09.938330 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8b47ff15-96b3-49ac-a5e5-1ce1051d53a0/probe/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.138398 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8b47ff15-96b3-49ac-a5e5-1ce1051d53a0/cinder-backup/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.192004 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb0e912d-791f-436a-9e94-1e60281b6654/cinder-scheduler/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.278863 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb0e912d-791f-436a-9e94-1e60281b6654/probe/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.477705 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f4b10553-8ea5-49bf-96cf-c22620f1ced3/cinder-volume/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.519177 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f4b10553-8ea5-49bf-96cf-c22620f1ced3/probe/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.654362 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ht5hf_4f55b6f9-370f-489f-9bfd-989fbc5cd8b9/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.864310 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-57zm8_e02e8e56-086f-4152-accb-b8ffdb55a215/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:10 crc kubenswrapper[4764]: I1001 17:20:10.948890 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-f4vwb_4d7d3e82-e4f3-48e1-99ac-949325fec6cb/init/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.098587 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-f4vwb_4d7d3e82-e4f3-48e1-99ac-949325fec6cb/init/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.146212 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-f4vwb_4d7d3e82-e4f3-48e1-99ac-949325fec6cb/dnsmasq-dns/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.266720 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48929539-7e51-4e55-bf3f-d168cab2e600/glance-httpd/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.322302 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_48929539-7e51-4e55-bf3f-d168cab2e600/glance-log/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.443736 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c9860202-23f9-492f-b6b7-fd90d113ad6d/glance-httpd/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.498920 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_c9860202-23f9-492f-b6b7-fd90d113ad6d/glance-log/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.750739 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-554f5d45dd-s9w79_b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d/horizon/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.948202 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-554f5d45dd-s9w79_b89cb37b-cedf-4fd8-bfd5-1b6c78a4ef2d/horizon-log/0.log" Oct 01 17:20:11 crc kubenswrapper[4764]: I1001 17:20:11.984252 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-lvr9f_c6f12828-d7f8-45a2-932c-b866030ce666/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:12 crc kubenswrapper[4764]: I1001 17:20:12.119177 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wtrjr_a0273bd3-26f6-44d9-a665-75c9eac2cf98/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:12 crc kubenswrapper[4764]: I1001 17:20:12.490751 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322301-kgbcw_c858aedc-be1d-4bd6-8c80-906c5345a7df/keystone-cron/0.log" Oct 01 17:20:12 crc kubenswrapper[4764]: I1001 17:20:12.749339 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a1a83996-de2f-4abe-a075-8c0c2191eb7b/kube-state-metrics/0.log" Oct 01 17:20:12 crc kubenswrapper[4764]: I1001 17:20:12.971862 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78f7fcb65-9gxk4_3b093936-cdfc-4f2c-a8a6-86820b145b73/keystone-api/0.log" Oct 01 17:20:13 crc kubenswrapper[4764]: I1001 17:20:13.031277 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-d4f4d_7a7a0b62-c0f8-4e7b-8a3c-40a5efe8fdf7/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:13 crc kubenswrapper[4764]: I1001 17:20:13.663863 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e/manila-api/0.log" Oct 01 17:20:14 crc kubenswrapper[4764]: I1001 17:20:14.079214 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_0f9e419b-d3d5-4813-a276-aac1f18ef4f4/probe/0.log" Oct 01 17:20:14 crc kubenswrapper[4764]: I1001 17:20:14.100931 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_aa5e6b13-1f1e-47bb-81ac-9c333fd1dc0e/manila-api-log/0.log" Oct 01 17:20:14 crc kubenswrapper[4764]: I1001 17:20:14.133809 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_0f9e419b-d3d5-4813-a276-aac1f18ef4f4/manila-scheduler/0.log" Oct 01 17:20:14 crc kubenswrapper[4764]: I1001 17:20:14.361784 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ca31e39f-7fcb-4389-882d-cfa2d4491df4/probe/0.log" Oct 01 17:20:14 crc kubenswrapper[4764]: I1001 17:20:14.516231 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_ca31e39f-7fcb-4389-882d-cfa2d4491df4/manila-share/0.log" Oct 01 17:20:14 crc kubenswrapper[4764]: I1001 17:20:14.843234 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d577ff6cf-5gk59_acde2ba2-32bc-4d80-aa8d-dd6505c14da3/neutron-api/0.log" Oct 01 17:20:15 crc kubenswrapper[4764]: I1001 17:20:15.038270 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5d577ff6cf-5gk59_acde2ba2-32bc-4d80-aa8d-dd6505c14da3/neutron-httpd/0.log" Oct 01 17:20:15 crc kubenswrapper[4764]: I1001 17:20:15.551883 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-ffgmw_199ab555-85f7-4168-9e83-a5060e006dc4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:16 crc kubenswrapper[4764]: I1001 17:20:16.192832 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_445abcb4-96ed-403c-bf18-0c1bc5440182/nova-api-log/0.log" Oct 01 17:20:16 crc kubenswrapper[4764]: I1001 17:20:16.563208 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_445abcb4-96ed-403c-bf18-0c1bc5440182/nova-api-api/0.log" Oct 01 17:20:16 crc kubenswrapper[4764]: I1001 17:20:16.722329 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:20:16 crc kubenswrapper[4764]: E1001 17:20:16.723061 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:20:16 crc kubenswrapper[4764]: I1001 17:20:16.924692 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_eb3c7b0f-1d91-4e76-9b15-3322d2dfd3fa/nova-cell0-conductor-conductor/0.log" Oct 01 17:20:17 crc kubenswrapper[4764]: I1001 17:20:17.497631 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_78988164-5797-4cee-a8a9-7f87adeb170a/nova-cell1-conductor-conductor/0.log" Oct 01 17:20:17 crc kubenswrapper[4764]: I1001 17:20:17.511587 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_384d37eb-2732-48d4-b38d-2befbd3d0cce/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 17:20:17 crc 
kubenswrapper[4764]: I1001 17:20:17.810387 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-9dvfh_473bdd59-1196-45be-931d-f452ce6bc2fa/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:17 crc kubenswrapper[4764]: I1001 17:20:17.863854 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faef6b6-44c8-4251-981a-ca6f0eddeda1/nova-metadata-log/0.log" Oct 01 17:20:18 crc kubenswrapper[4764]: I1001 17:20:18.281891 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6e705f16-7e06-46aa-a290-f42760df1c2c/nova-scheduler-scheduler/0.log" Oct 01 17:20:18 crc kubenswrapper[4764]: I1001 17:20:18.501433 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca/mysql-bootstrap/0.log" Oct 01 17:20:18 crc kubenswrapper[4764]: I1001 17:20:18.728369 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca/mysql-bootstrap/0.log" Oct 01 17:20:18 crc kubenswrapper[4764]: I1001 17:20:18.815563 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4c6ca82c-6ff1-48ba-b3c1-e07432ac2cca/galera/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.078534 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4acbf2ba-c326-445b-b6f6-11458a1dfb68/mysql-bootstrap/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.274436 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4acbf2ba-c326-445b-b6f6-11458a1dfb68/mysql-bootstrap/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.351033 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_4acbf2ba-c326-445b-b6f6-11458a1dfb68/galera/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.587538 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_2d91465c-097e-4579-a5de-df0547d06dbf/openstackclient/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.786865 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-r8kfk_80dd0b93-add9-4524-8cef-a32b4250e094/openstack-network-exporter/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.799017 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7faef6b6-44c8-4251-981a-ca6f0eddeda1/nova-metadata-metadata/0.log" Oct 01 17:20:19 crc kubenswrapper[4764]: I1001 17:20:19.988870 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovsdb-server-init/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.257985 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovsdb-server-init/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.260729 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovs-vswitchd/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.290780 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cfp4k_da235e7c-70c7-4e8e-bf34-260bfc0cb986/ovsdb-server/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.521469 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wjjhq_79962852-f159-44df-bd50-38928f3df91d/ovn-controller/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.723047 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xjmq8_ad66f863-1f1b-40f8-8a3f-464eaf32a344/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.778813 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ca14b5f3-e2fc-4fc1-9800-d64209a4c266/openstack-network-exporter/0.log" Oct 01 17:20:20 crc kubenswrapper[4764]: I1001 17:20:20.941450 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ca14b5f3-e2fc-4fc1-9800-d64209a4c266/ovn-northd/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.014276 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e870edc1-ed3c-4c16-8c20-cde661ac4ce0/openstack-network-exporter/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.144529 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e870edc1-ed3c-4c16-8c20-cde661ac4ce0/ovsdbserver-nb/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.233191 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_923cf4a9-e116-4a84-ae06-dec150a649bc/openstack-network-exporter/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.401151 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_923cf4a9-e116-4a84-ae06-dec150a649bc/ovsdbserver-sb/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.526592 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8695dd9c7b-mwdsh_d43038a4-064b-4ecf-bebf-0f4d6116a839/placement-api/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.642477 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8695dd9c7b-mwdsh_d43038a4-064b-4ecf-bebf-0f4d6116a839/placement-log/0.log" Oct 01 17:20:21 crc kubenswrapper[4764]: I1001 17:20:21.723311 4764 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e6c99317-e5aa-4c87-a45a-34e4d14846e4/setup-container/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.020294 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e6c99317-e5aa-4c87-a45a-34e4d14846e4/setup-container/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.046268 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e6c99317-e5aa-4c87-a45a-34e4d14846e4/rabbitmq/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.214375 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_17487462-b952-4428-a875-61732b895017/setup-container/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.490344 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_17487462-b952-4428-a875-61732b895017/setup-container/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.529953 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_17487462-b952-4428-a875-61732b895017/rabbitmq/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.714096 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-qw7nc_a20f9deb-1422-462b-81d1-89cfef47f81d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.748656 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-f28wf_839c68a1-2404-4037-8975-58e6b02ba81f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:22 crc kubenswrapper[4764]: I1001 17:20:22.951720 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bj86q_a0fbe741-65d7-464f-b6c6-ecdb60f8bb21/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:23 crc kubenswrapper[4764]: I1001 17:20:23.226872 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-5rdtx_56678028-55b3-410f-a642-999c1f035e88/ssh-known-hosts-edpm-deployment/0.log" Oct 01 17:20:23 crc kubenswrapper[4764]: I1001 17:20:23.295219 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0d43e380-092b-4488-9956-0ca607448dd4/tempest-tests-tempest-tests-runner/0.log" Oct 01 17:20:23 crc kubenswrapper[4764]: I1001 17:20:23.456087 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_28fe40a4-f653-4273-b45c-7b9503d9704f/test-operator-logs-container/0.log" Oct 01 17:20:23 crc kubenswrapper[4764]: I1001 17:20:23.637903 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-cmqdt_0134afb9-9d23-47e6-9d46-6a025c3a3a57/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 17:20:28 crc kubenswrapper[4764]: I1001 17:20:28.722744 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:20:28 crc kubenswrapper[4764]: E1001 17:20:28.723568 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:20:39 crc kubenswrapper[4764]: I1001 17:20:39.331985 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_c346177b-4aeb-43b2-8f86-ce57d0d42c10/memcached/0.log" Oct 01 17:20:43 crc kubenswrapper[4764]: I1001 17:20:43.722425 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:20:43 crc kubenswrapper[4764]: E1001 17:20:43.723336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:20:55 crc kubenswrapper[4764]: I1001 17:20:55.990868 4764 scope.go:117] "RemoveContainer" containerID="6fbf80603f25504118c76cfe9a7b047f01700edda9b0fb4a20306a81a6497fc7" Oct 01 17:20:58 crc kubenswrapper[4764]: I1001 17:20:58.722742 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:20:58 crc kubenswrapper[4764]: E1001 17:20:58.724626 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:21:10 crc kubenswrapper[4764]: I1001 17:21:10.724288 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:21:10 crc kubenswrapper[4764]: E1001 17:21:10.724967 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:21:11 crc kubenswrapper[4764]: I1001 17:21:11.792100 4764 generic.go:334] "Generic (PLEG): container finished" podID="972d86d6-ca6f-44b8-b75d-d8a453e11843" containerID="c2ab0a8d8731c1ede0182636a5ea166c683951461fe084b8729aab8b99d6c98e" exitCode=0 Oct 01 17:21:11 crc kubenswrapper[4764]: I1001 17:21:11.792209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-lskbd" event={"ID":"972d86d6-ca6f-44b8-b75d-d8a453e11843","Type":"ContainerDied","Data":"c2ab0a8d8731c1ede0182636a5ea166c683951461fe084b8729aab8b99d6c98e"} Oct 01 17:21:12 crc kubenswrapper[4764]: I1001 17:21:12.932194 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:21:12 crc kubenswrapper[4764]: I1001 17:21:12.965977 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v59zf/crc-debug-lskbd"] Oct 01 17:21:12 crc kubenswrapper[4764]: I1001 17:21:12.975876 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v59zf/crc-debug-lskbd"] Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.106245 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/972d86d6-ca6f-44b8-b75d-d8a453e11843-host\") pod \"972d86d6-ca6f-44b8-b75d-d8a453e11843\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.106379 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/972d86d6-ca6f-44b8-b75d-d8a453e11843-host" (OuterVolumeSpecName: "host") pod 
"972d86d6-ca6f-44b8-b75d-d8a453e11843" (UID: "972d86d6-ca6f-44b8-b75d-d8a453e11843"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.106519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8c72\" (UniqueName: \"kubernetes.io/projected/972d86d6-ca6f-44b8-b75d-d8a453e11843-kube-api-access-w8c72\") pod \"972d86d6-ca6f-44b8-b75d-d8a453e11843\" (UID: \"972d86d6-ca6f-44b8-b75d-d8a453e11843\") " Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.107011 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/972d86d6-ca6f-44b8-b75d-d8a453e11843-host\") on node \"crc\" DevicePath \"\"" Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.131396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972d86d6-ca6f-44b8-b75d-d8a453e11843-kube-api-access-w8c72" (OuterVolumeSpecName: "kube-api-access-w8c72") pod "972d86d6-ca6f-44b8-b75d-d8a453e11843" (UID: "972d86d6-ca6f-44b8-b75d-d8a453e11843"). InnerVolumeSpecName "kube-api-access-w8c72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.209359 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8c72\" (UniqueName: \"kubernetes.io/projected/972d86d6-ca6f-44b8-b75d-d8a453e11843-kube-api-access-w8c72\") on node \"crc\" DevicePath \"\"" Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.735346 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972d86d6-ca6f-44b8-b75d-d8a453e11843" path="/var/lib/kubelet/pods/972d86d6-ca6f-44b8-b75d-d8a453e11843/volumes" Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.820752 4764 scope.go:117] "RemoveContainer" containerID="c2ab0a8d8731c1ede0182636a5ea166c683951461fe084b8729aab8b99d6c98e" Oct 01 17:21:13 crc kubenswrapper[4764]: I1001 17:21:13.820948 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-lskbd" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.133357 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v59zf/crc-debug-58dtv"] Oct 01 17:21:14 crc kubenswrapper[4764]: E1001 17:21:14.133841 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="registry-server" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.133859 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="registry-server" Oct 01 17:21:14 crc kubenswrapper[4764]: E1001 17:21:14.133884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972d86d6-ca6f-44b8-b75d-d8a453e11843" containerName="container-00" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.133893 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="972d86d6-ca6f-44b8-b75d-d8a453e11843" containerName="container-00" Oct 01 17:21:14 crc kubenswrapper[4764]: E1001 17:21:14.133913 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="extract-content" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.133922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="extract-content" Oct 01 17:21:14 crc kubenswrapper[4764]: E1001 17:21:14.133940 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="extract-utilities" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.133945 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="extract-utilities" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.134152 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1952fe9-1eeb-48df-a902-99ca6708f92d" containerName="registry-server" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.134169 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="972d86d6-ca6f-44b8-b75d-d8a453e11843" containerName="container-00" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.135254 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.137626 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v59zf"/"default-dockercfg-wr99z" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.263135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32fb0f51-422f-415b-bf67-4274d4e6b09a-host\") pod \"crc-debug-58dtv\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.263193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rsv\" (UniqueName: \"kubernetes.io/projected/32fb0f51-422f-415b-bf67-4274d4e6b09a-kube-api-access-99rsv\") pod \"crc-debug-58dtv\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.365427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32fb0f51-422f-415b-bf67-4274d4e6b09a-host\") pod \"crc-debug-58dtv\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.365482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99rsv\" (UniqueName: \"kubernetes.io/projected/32fb0f51-422f-415b-bf67-4274d4e6b09a-kube-api-access-99rsv\") pod \"crc-debug-58dtv\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.365590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/32fb0f51-422f-415b-bf67-4274d4e6b09a-host\") pod \"crc-debug-58dtv\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.382870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rsv\" (UniqueName: \"kubernetes.io/projected/32fb0f51-422f-415b-bf67-4274d4e6b09a-kube-api-access-99rsv\") pod \"crc-debug-58dtv\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.457838 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.832713 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-58dtv" event={"ID":"32fb0f51-422f-415b-bf67-4274d4e6b09a","Type":"ContainerStarted","Data":"4a1b769d08d1b59598a59e25a7263bb3afab4247cd839f10ec8fb84433157f8d"} Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.833235 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-58dtv" event={"ID":"32fb0f51-422f-415b-bf67-4274d4e6b09a","Type":"ContainerStarted","Data":"0013b267de11738774a192b286de4110ea592f9fe927761de7916486b84771c4"} Oct 01 17:21:14 crc kubenswrapper[4764]: I1001 17:21:14.855629 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v59zf/crc-debug-58dtv" podStartSLOduration=0.855607205 podStartE2EDuration="855.607205ms" podCreationTimestamp="2025-10-01 17:21:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 17:21:14.846807698 +0000 UTC m=+4737.846454603" watchObservedRunningTime="2025-10-01 17:21:14.855607205 +0000 UTC m=+4737.855254040" Oct 01 
17:21:15 crc kubenswrapper[4764]: I1001 17:21:15.843369 4764 generic.go:334] "Generic (PLEG): container finished" podID="32fb0f51-422f-415b-bf67-4274d4e6b09a" containerID="4a1b769d08d1b59598a59e25a7263bb3afab4247cd839f10ec8fb84433157f8d" exitCode=0 Oct 01 17:21:15 crc kubenswrapper[4764]: I1001 17:21:15.843443 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-58dtv" event={"ID":"32fb0f51-422f-415b-bf67-4274d4e6b09a","Type":"ContainerDied","Data":"4a1b769d08d1b59598a59e25a7263bb3afab4247cd839f10ec8fb84433157f8d"} Oct 01 17:21:16 crc kubenswrapper[4764]: I1001 17:21:16.950735 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.111129 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99rsv\" (UniqueName: \"kubernetes.io/projected/32fb0f51-422f-415b-bf67-4274d4e6b09a-kube-api-access-99rsv\") pod \"32fb0f51-422f-415b-bf67-4274d4e6b09a\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.111671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32fb0f51-422f-415b-bf67-4274d4e6b09a-host\") pod \"32fb0f51-422f-415b-bf67-4274d4e6b09a\" (UID: \"32fb0f51-422f-415b-bf67-4274d4e6b09a\") " Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.111724 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32fb0f51-422f-415b-bf67-4274d4e6b09a-host" (OuterVolumeSpecName: "host") pod "32fb0f51-422f-415b-bf67-4274d4e6b09a" (UID: "32fb0f51-422f-415b-bf67-4274d4e6b09a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.112435 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32fb0f51-422f-415b-bf67-4274d4e6b09a-host\") on node \"crc\" DevicePath \"\"" Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.118273 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fb0f51-422f-415b-bf67-4274d4e6b09a-kube-api-access-99rsv" (OuterVolumeSpecName: "kube-api-access-99rsv") pod "32fb0f51-422f-415b-bf67-4274d4e6b09a" (UID: "32fb0f51-422f-415b-bf67-4274d4e6b09a"). InnerVolumeSpecName "kube-api-access-99rsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.213802 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99rsv\" (UniqueName: \"kubernetes.io/projected/32fb0f51-422f-415b-bf67-4274d4e6b09a-kube-api-access-99rsv\") on node \"crc\" DevicePath \"\"" Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.870537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-58dtv" event={"ID":"32fb0f51-422f-415b-bf67-4274d4e6b09a","Type":"ContainerDied","Data":"0013b267de11738774a192b286de4110ea592f9fe927761de7916486b84771c4"} Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.871105 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0013b267de11738774a192b286de4110ea592f9fe927761de7916486b84771c4" Oct 01 17:21:17 crc kubenswrapper[4764]: I1001 17:21:17.870605 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-58dtv" Oct 01 17:21:18 crc kubenswrapper[4764]: E1001 17:21:18.397694 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0f51_422f_415b_bf67_4274d4e6b09a.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:21:24 crc kubenswrapper[4764]: I1001 17:21:24.667190 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v59zf/crc-debug-58dtv"] Oct 01 17:21:24 crc kubenswrapper[4764]: I1001 17:21:24.675826 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v59zf/crc-debug-58dtv"] Oct 01 17:21:25 crc kubenswrapper[4764]: I1001 17:21:25.722351 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:21:25 crc kubenswrapper[4764]: E1001 17:21:25.722795 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:21:25 crc kubenswrapper[4764]: I1001 17:21:25.734067 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fb0f51-422f-415b-bf67-4274d4e6b09a" path="/var/lib/kubelet/pods/32fb0f51-422f-415b-bf67-4274d4e6b09a/volumes" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.627104 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v59zf/crc-debug-mw6rt"] Oct 01 17:21:26 crc kubenswrapper[4764]: E1001 17:21:26.627470 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32fb0f51-422f-415b-bf67-4274d4e6b09a" containerName="container-00" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.627485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fb0f51-422f-415b-bf67-4274d4e6b09a" containerName="container-00" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.627697 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fb0f51-422f-415b-bf67-4274d4e6b09a" containerName="container-00" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.628365 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.630128 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v59zf"/"default-dockercfg-wr99z" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.790390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll76k\" (UniqueName: \"kubernetes.io/projected/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-kube-api-access-ll76k\") pod \"crc-debug-mw6rt\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.790758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-host\") pod \"crc-debug-mw6rt\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.892308 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll76k\" (UniqueName: \"kubernetes.io/projected/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-kube-api-access-ll76k\") pod \"crc-debug-mw6rt\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " 
pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.892618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-host\") pod \"crc-debug-mw6rt\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:26 crc kubenswrapper[4764]: I1001 17:21:26.893184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-host\") pod \"crc-debug-mw6rt\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:27 crc kubenswrapper[4764]: I1001 17:21:27.074952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll76k\" (UniqueName: \"kubernetes.io/projected/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-kube-api-access-ll76k\") pod \"crc-debug-mw6rt\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:27 crc kubenswrapper[4764]: I1001 17:21:27.248271 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:27 crc kubenswrapper[4764]: I1001 17:21:27.962515 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" containerID="0d89fdbf83c2ad6b04ab4218e45197b59e672cd67c73d9eab090e38d18ae54bf" exitCode=0 Oct 01 17:21:27 crc kubenswrapper[4764]: I1001 17:21:27.962601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-mw6rt" event={"ID":"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf","Type":"ContainerDied","Data":"0d89fdbf83c2ad6b04ab4218e45197b59e672cd67c73d9eab090e38d18ae54bf"} Oct 01 17:21:27 crc kubenswrapper[4764]: I1001 17:21:27.962967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/crc-debug-mw6rt" event={"ID":"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf","Type":"ContainerStarted","Data":"28d84aafca3a1393a30d26437818f9e986ec40f251fa3241b809c4bce9d1f72d"} Oct 01 17:21:28 crc kubenswrapper[4764]: I1001 17:21:28.003133 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v59zf/crc-debug-mw6rt"] Oct 01 17:21:28 crc kubenswrapper[4764]: I1001 17:21:28.012272 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v59zf/crc-debug-mw6rt"] Oct 01 17:21:28 crc kubenswrapper[4764]: E1001 17:21:28.659435 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0f51_422f_415b_bf67_4274d4e6b09a.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.089168 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.233518 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-host\") pod \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.233592 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll76k\" (UniqueName: \"kubernetes.io/projected/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-kube-api-access-ll76k\") pod \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\" (UID: \"d5ed7e0d-2500-4c64-8ae1-578b8c9825cf\") " Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.233630 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-host" (OuterVolumeSpecName: "host") pod "d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" (UID: "d5ed7e0d-2500-4c64-8ae1-578b8c9825cf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.234182 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-host\") on node \"crc\" DevicePath \"\"" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.239438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-kube-api-access-ll76k" (OuterVolumeSpecName: "kube-api-access-ll76k") pod "d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" (UID: "d5ed7e0d-2500-4c64-8ae1-578b8c9825cf"). InnerVolumeSpecName "kube-api-access-ll76k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.337176 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll76k\" (UniqueName: \"kubernetes.io/projected/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf-kube-api-access-ll76k\") on node \"crc\" DevicePath \"\"" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.616318 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/util/0.log" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.733507 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" path="/var/lib/kubelet/pods/d5ed7e0d-2500-4c64-8ae1-578b8c9825cf/volumes" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.866237 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/pull/0.log" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.876713 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/pull/0.log" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.879477 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/util/0.log" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.981921 4764 scope.go:117] "RemoveContainer" containerID="0d89fdbf83c2ad6b04ab4218e45197b59e672cd67c73d9eab090e38d18ae54bf" Oct 01 17:21:29 crc kubenswrapper[4764]: I1001 17:21:29.982015 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v59zf/crc-debug-mw6rt" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.046671 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/pull/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.083082 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/extract/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.084747 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4de61f7ad959a632db6380485e1ba427bfa16cd4213eaedfa8bf8d1fe02vqg5_9944bc89-7591-4fe0-81a2-41dba5a75f37/util/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.218102 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-4724b_af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd/kube-rbac-proxy/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.304951 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-4724b_af86c8bd-6b9f-4cf1-8ffc-d441a90f25fd/manager/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.357442 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lrzn4_74ebea6a-ca87-4ccb-ab25-3c4899c04d39/kube-rbac-proxy/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.499287 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-lrzn4_74ebea6a-ca87-4ccb-ab25-3c4899c04d39/manager/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.520170 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-27n67_0fe3c02d-9e92-4628-8d94-6797d56fe480/kube-rbac-proxy/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.572463 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-27n67_0fe3c02d-9e92-4628-8d94-6797d56fe480/manager/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.717137 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-vnmvt_0fff9cd4-9690-4a70-a578-0eadbcbb47d6/kube-rbac-proxy/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.866891 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-vnmvt_0fff9cd4-9690-4a70-a578-0eadbcbb47d6/manager/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.946695 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-9vv9s_9973c37b-d58f-48b0-8c1e-707576e2cb09/manager/0.log" Oct 01 17:21:30 crc kubenswrapper[4764]: I1001 17:21:30.956472 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-9vv9s_9973c37b-d58f-48b0-8c1e-707576e2cb09/kube-rbac-proxy/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.045082 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-g528r_323e4260-1016-4601-a8c1-f75641230fdb/kube-rbac-proxy/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.134208 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-g528r_323e4260-1016-4601-a8c1-f75641230fdb/manager/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.213272 
4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-bwj7p_8c17b6bf-0c13-491a-977a-95566d56d7c4/kube-rbac-proxy/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.396090 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-62xn4_c535869c-c448-4bea-944d-fce55ddd334c/kube-rbac-proxy/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.407639 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-62xn4_c535869c-c448-4bea-944d-fce55ddd334c/manager/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.537827 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-bwj7p_8c17b6bf-0c13-491a-977a-95566d56d7c4/manager/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.580491 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-fg9hc_67c0305d-d391-4892-869d-f5702a69cc45/kube-rbac-proxy/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.675182 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-fg9hc_67c0305d-d391-4892-869d-f5702a69cc45/manager/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.790386 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b67755477-8xdpz_db117cfb-3e46-4428-93b1-44a66101c57d/kube-rbac-proxy/0.log" Oct 01 17:21:31 crc kubenswrapper[4764]: I1001 17:21:31.815325 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b67755477-8xdpz_db117cfb-3e46-4428-93b1-44a66101c57d/manager/0.log" Oct 01 17:21:31 crc 
kubenswrapper[4764]: I1001 17:21:31.930331 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-lms99_e4b31c01-ec06-434e-af2a-228a1ee7ec19/kube-rbac-proxy/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.004913 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-lms99_e4b31c01-ec06-434e-af2a-228a1ee7ec19/manager/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.058755 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-z9988_53b1bb68-341f-4635-8339-ff10c9b08dee/kube-rbac-proxy/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.181828 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-z9988_53b1bb68-341f-4635-8339-ff10c9b08dee/manager/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.234408 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-59r2r_394ffa3b-3cd6-4deb-a436-624fa75155a2/kube-rbac-proxy/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.395102 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-59r2r_394ffa3b-3cd6-4deb-a436-624fa75155a2/manager/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.432616 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-vnlvl_2050b8cd-91c1-4962-b346-bbfa5c4e652e/kube-rbac-proxy/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.441412 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-vnlvl_2050b8cd-91c1-4962-b346-bbfa5c4e652e/manager/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.612767 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6_ba9b6db9-115e-4760-aef3-107976da810e/kube-rbac-proxy/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.669365 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cnj9b6_ba9b6db9-115e-4760-aef3-107976da810e/manager/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.755468 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c86467d95-tnl8h_0775c44c-131f-4a9c-89d5-bd724765e310/kube-rbac-proxy/0.log" Oct 01 17:21:32 crc kubenswrapper[4764]: I1001 17:21:32.924564 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58f6bc99f-xsx4k_79ca6c6a-0b9f-4122-87ff-4eeb56046125/kube-rbac-proxy/0.log" Oct 01 17:21:33 crc kubenswrapper[4764]: I1001 17:21:33.102730 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58f6bc99f-xsx4k_79ca6c6a-0b9f-4122-87ff-4eeb56046125/operator/0.log" Oct 01 17:21:33 crc kubenswrapper[4764]: I1001 17:21:33.148886 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-275hc_17f0747f-d044-4d4f-bdad-937743cfb537/registry-server/0.log" Oct 01 17:21:33 crc kubenswrapper[4764]: I1001 17:21:33.451345 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-rkr2b_33cb2692-6fcf-4af5-bf43-697a4a740c19/manager/0.log" Oct 01 17:21:33 crc kubenswrapper[4764]: I1001 
17:21:33.480270 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-rkr2b_33cb2692-6fcf-4af5-bf43-697a4a740c19/kube-rbac-proxy/0.log" Oct 01 17:21:33 crc kubenswrapper[4764]: I1001 17:21:33.694163 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-jlwbd_d591342f-60e7-48db-9073-b2d6e9fe6992/kube-rbac-proxy/0.log" Oct 01 17:21:33 crc kubenswrapper[4764]: I1001 17:21:33.776804 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-jlwbd_d591342f-60e7-48db-9073-b2d6e9fe6992/manager/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.033542 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-qn5hp_9bb8b56a-c568-4ea4-985e-a80d49b61197/operator/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.117387 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-vbgxb_705820be-248f-49fb-9ace-0f333674985a/kube-rbac-proxy/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.125524 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-vbgxb_705820be-248f-49fb-9ace-0f333674985a/manager/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.133809 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c86467d95-tnl8h_0775c44c-131f-4a9c-89d5-bd724765e310/manager/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.294451 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-8wqsk_10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b/kube-rbac-proxy/0.log" Oct 
01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.345352 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-q6nwl_e55f7c89-8011-437e-bcbc-b19ae9e25acd/kube-rbac-proxy/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.415954 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-8wqsk_10d8bbe9-a54f-489a-8fdc-a6acf5b6a46b/manager/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.490561 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-q6nwl_e55f7c89-8011-437e-bcbc-b19ae9e25acd/manager/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.512798 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f5nmv_29eab2f9-7ef6-4cc6-9f45-af32a4071a5d/kube-rbac-proxy/0.log" Oct 01 17:21:34 crc kubenswrapper[4764]: I1001 17:21:34.578702 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-f5nmv_29eab2f9-7ef6-4cc6-9f45-af32a4071a5d/manager/0.log" Oct 01 17:21:38 crc kubenswrapper[4764]: E1001 17:21:38.925853 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0f51_422f_415b_bf67_4274d4e6b09a.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:21:40 crc kubenswrapper[4764]: I1001 17:21:40.721632 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:21:40 crc kubenswrapper[4764]: E1001 17:21:40.722729 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:21:49 crc kubenswrapper[4764]: E1001 17:21:49.185672 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0f51_422f_415b_bf67_4274d4e6b09a.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:21:50 crc kubenswrapper[4764]: I1001 17:21:50.421225 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lntzx_eccae5ea-5d95-4d65-97cd-9d8ee4db20bc/control-plane-machine-set-operator/0.log" Oct 01 17:21:50 crc kubenswrapper[4764]: I1001 17:21:50.603655 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4t4_1e81f7ca-2bc8-4d14-a101-e73361300228/kube-rbac-proxy/0.log" Oct 01 17:21:50 crc kubenswrapper[4764]: I1001 17:21:50.647300 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lt4t4_1e81f7ca-2bc8-4d14-a101-e73361300228/machine-api-operator/0.log" Oct 01 17:21:55 crc kubenswrapper[4764]: I1001 17:21:55.721706 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:21:55 crc kubenswrapper[4764]: E1001 17:21:55.722336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:21:59 crc kubenswrapper[4764]: E1001 17:21:59.437119 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0f51_422f_415b_bf67_4274d4e6b09a.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:22:02 crc kubenswrapper[4764]: I1001 17:22:02.842808 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xl9g4_b22968b8-7419-48e2-9fab-a54611dcecad/cert-manager-controller/0.log" Oct 01 17:22:03 crc kubenswrapper[4764]: I1001 17:22:03.044153 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-q78nt_0e72f60f-a975-4370-877f-5d5ba3c7c0b3/cert-manager-cainjector/0.log" Oct 01 17:22:03 crc kubenswrapper[4764]: I1001 17:22:03.097869 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-pxbtx_9eb7dd6e-2f03-4db5-9564-d87513d69d6b/cert-manager-webhook/0.log" Oct 01 17:22:08 crc kubenswrapper[4764]: I1001 17:22:08.721764 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:22:08 crc kubenswrapper[4764]: E1001 17:22:08.722544 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:22:09 crc kubenswrapper[4764]: E1001 17:22:09.705875 4764 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0f51_422f_415b_bf67_4274d4e6b09a.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:22:15 crc kubenswrapper[4764]: I1001 17:22:15.380518 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-t66ns_5b55cc57-324e-4f77-aa9f-d655abd399b4/nmstate-console-plugin/0.log" Oct 01 17:22:15 crc kubenswrapper[4764]: I1001 17:22:15.758400 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-snhkz_b82401c8-962a-451a-954e-2f603fe91129/nmstate-handler/0.log" Oct 01 17:22:15 crc kubenswrapper[4764]: I1001 17:22:15.780995 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dg2fb_e8f3e872-b4e9-4b58-95aa-f63812824933/kube-rbac-proxy/0.log" Oct 01 17:22:15 crc kubenswrapper[4764]: I1001 17:22:15.794748 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dg2fb_e8f3e872-b4e9-4b58-95aa-f63812824933/nmstate-metrics/0.log" Oct 01 17:22:15 crc kubenswrapper[4764]: I1001 17:22:15.987553 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-lfzc4_dd999924-e56c-45fa-8214-cec275174611/nmstate-operator/0.log" Oct 01 17:22:16 crc kubenswrapper[4764]: I1001 17:22:16.012650 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bmnmm_2f2de87a-678a-4de3-8fec-9b695f301201/nmstate-webhook/0.log" Oct 01 17:22:20 crc kubenswrapper[4764]: I1001 17:22:20.722139 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:22:20 crc kubenswrapper[4764]: E1001 17:22:20.722888 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.237236 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-mxkb8_1e96a209-b683-4060-8c9f-b7b1ae8c89b0/kube-rbac-proxy/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.265377 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-mxkb8_1e96a209-b683-4060-8c9f-b7b1ae8c89b0/controller/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.510584 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.645627 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.659617 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.683212 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.717796 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.865632 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.870312 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.886825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:22:30 crc kubenswrapper[4764]: I1001 17:22:30.931242 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.082629 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-metrics/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.082629 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-frr-files/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.111578 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/cp-reloader/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.170693 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/controller/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.292572 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/frr-metrics/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.302101 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/kube-rbac-proxy/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.427689 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/kube-rbac-proxy-frr/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.565355 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/reloader/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.646991 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-bhdff_247af8f1-5e4b-4e17-9d31-055bdce2a1d6/frr-k8s-webhook-server/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.874583 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-845477fbc7-z8qk5_3b485840-2253-4eb1-888b-5e16d76a3a3d/manager/0.log" Oct 01 17:22:31 crc kubenswrapper[4764]: I1001 17:22:31.880513 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k2274_d7c7ca03-94fe-4d3b-914b-669bfd41d526/frr/0.log" Oct 01 17:22:32 crc kubenswrapper[4764]: I1001 17:22:32.023720 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-755d79df97-5b9ck_54e1ea89-8cd7-4a1c-a1b9-e20f321198f7/webhook-server/0.log" Oct 01 17:22:32 crc kubenswrapper[4764]: I1001 17:22:32.144915 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zm88x_0bf73b76-19c9-4264-abe9-80d24dcf6ee6/kube-rbac-proxy/0.log" Oct 01 17:22:32 crc kubenswrapper[4764]: I1001 17:22:32.246919 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zm88x_0bf73b76-19c9-4264-abe9-80d24dcf6ee6/speaker/0.log" Oct 01 17:22:33 crc kubenswrapper[4764]: I1001 17:22:33.722689 4764 scope.go:117] 
"RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:22:33 crc kubenswrapper[4764]: E1001 17:22:33.723356 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:22:44 crc kubenswrapper[4764]: I1001 17:22:44.723148 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:22:44 crc kubenswrapper[4764]: E1001 17:22:44.723999 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:22:45 crc kubenswrapper[4764]: I1001 17:22:45.955553 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/util/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.159817 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/pull/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.168871 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/util/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.205543 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/pull/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.355839 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/extract/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.361677 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/util/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.375492 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2kxj22_4d76e759-050c-4e98-b79c-6eb25431c21e/pull/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.532660 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/util/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.681412 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/util/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.736898 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/pull/0.log" Oct 01 
17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.764406 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/pull/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.943674 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/util/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.950841 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/pull/0.log" Oct 01 17:22:46 crc kubenswrapper[4764]: I1001 17:22:46.953568 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcsh9nw_da3eb657-76ae-4f69-9c49-51a7dfa7f054/extract/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.150205 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-utilities/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.320516 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-content/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.365893 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-content/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.366197 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-utilities/0.log" Oct 01 
17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.544759 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-content/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.624038 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/extract-utilities/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.827292 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-utilities/0.log" Oct 01 17:22:47 crc kubenswrapper[4764]: I1001 17:22:47.960686 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-utilities/0.log" Oct 01 17:22:48 crc kubenswrapper[4764]: I1001 17:22:48.036701 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-k6866_1709385e-fe6c-443e-a437-ceded08bde5b/registry-server/0.log" Oct 01 17:22:48 crc kubenswrapper[4764]: I1001 17:22:48.160524 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-content/0.log" Oct 01 17:22:48 crc kubenswrapper[4764]: I1001 17:22:48.261271 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-content/0.log" Oct 01 17:22:48 crc kubenswrapper[4764]: I1001 17:22:48.741145 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-content/0.log" Oct 01 17:22:48 crc kubenswrapper[4764]: I1001 17:22:48.804925 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/extract-utilities/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.006760 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/util/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.239537 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/util/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.271508 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/pull/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.319829 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/pull/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.503308 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/pull/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.573302 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/util/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.608931 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgqg9_6db57107-7b8f-48e1-8887-32516632caf8/extract/0.log" Oct 01 17:22:49 crc 
kubenswrapper[4764]: I1001 17:22:49.615002 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-66v4b_1436d169-be0b-479d-8a35-084020e816a2/registry-server/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.722966 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/util/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.926307 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/util/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.943662 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/pull/0.log" Oct 01 17:22:49 crc kubenswrapper[4764]: I1001 17:22:49.945453 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/pull/0.log" Oct 01 17:22:50 crc kubenswrapper[4764]: I1001 17:22:50.651968 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/util/0.log" Oct 01 17:22:50 crc kubenswrapper[4764]: I1001 17:22:50.673852 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/pull/0.log" Oct 01 17:22:50 crc kubenswrapper[4764]: I1001 17:22:50.679406 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xdv29_176eab78-eb2e-4612-994f-d13d95e6c80d/marketplace-operator/0.log" Oct 01 17:22:50 crc kubenswrapper[4764]: I1001 17:22:50.681573 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7f9ql_b3673e54-d98b-45f9-a98f-8ccb4e65ccf9/extract/0.log" Oct 01 17:22:50 crc kubenswrapper[4764]: I1001 17:22:50.874740 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-utilities/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.012737 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-content/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.045650 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-content/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.045697 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-utilities/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.221911 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-utilities/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.229431 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/extract-content/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.262954 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/extract-utilities/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.414588 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-dvfhb_c7dad4be-a7b9-48f9-9259-fbaffbb22bd2/registry-server/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.507828 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/extract-utilities/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.513902 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/extract-content/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.533683 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/extract-content/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.696817 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/extract-content/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.736012 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/extract-utilities/0.log" Oct 01 17:22:51 crc kubenswrapper[4764]: I1001 17:22:51.844424 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qkzln_49dde2bb-b746-4e0b-98a7-93b9b2e8ea5f/registry-server/0.log" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.129061 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-966pd"] Oct 01 17:22:56 crc kubenswrapper[4764]: E1001 17:22:56.130168 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" containerName="container-00" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.130185 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" containerName="container-00" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.130423 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ed7e0d-2500-4c64-8ae1-578b8c9825cf" containerName="container-00" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.132260 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.150756 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-966pd"] Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.233999 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-kube-api-access-4zw22\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.234138 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-catalog-content\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.234304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-utilities\") 
pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.336483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-kube-api-access-4zw22\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.336571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-catalog-content\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.336700 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-utilities\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.337351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-catalog-content\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.337422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-utilities\") pod \"community-operators-966pd\" (UID: 
\"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.356452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-kube-api-access-4zw22\") pod \"community-operators-966pd\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.492023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:22:56 crc kubenswrapper[4764]: I1001 17:22:56.722514 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:22:56 crc kubenswrapper[4764]: E1001 17:22:56.722986 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:22:57 crc kubenswrapper[4764]: I1001 17:22:57.040504 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-966pd"] Oct 01 17:22:57 crc kubenswrapper[4764]: I1001 17:22:57.772500 4764 generic.go:334] "Generic (PLEG): container finished" podID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerID="f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574" exitCode=0 Oct 01 17:22:57 crc kubenswrapper[4764]: I1001 17:22:57.772613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" 
event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerDied","Data":"f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574"} Oct 01 17:22:57 crc kubenswrapper[4764]: I1001 17:22:57.772849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerStarted","Data":"ad0725333ad5426ffe602477d1f20c1bcceabda7133d1be480fa03389200ede9"} Oct 01 17:22:59 crc kubenswrapper[4764]: I1001 17:22:59.794790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerStarted","Data":"e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e"} Oct 01 17:22:59 crc kubenswrapper[4764]: I1001 17:22:59.923193 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pxxhq"] Oct 01 17:22:59 crc kubenswrapper[4764]: I1001 17:22:59.925668 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:22:59 crc kubenswrapper[4764]: I1001 17:22:59.933540 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxxhq"] Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.013709 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-catalog-content\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.013846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzhp\" (UniqueName: \"kubernetes.io/projected/adbb3595-fa20-411b-9629-61db293c90c0-kube-api-access-lwzhp\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.013948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-utilities\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.115674 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-utilities\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.115739 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-catalog-content\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.115862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzhp\" (UniqueName: \"kubernetes.io/projected/adbb3595-fa20-411b-9629-61db293c90c0-kube-api-access-lwzhp\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.116362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-utilities\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.116391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-catalog-content\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.148640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzhp\" (UniqueName: \"kubernetes.io/projected/adbb3595-fa20-411b-9629-61db293c90c0-kube-api-access-lwzhp\") pod \"redhat-marketplace-pxxhq\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.241701 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.783108 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxxhq"] Oct 01 17:23:00 crc kubenswrapper[4764]: W1001 17:23:00.788725 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbb3595_fa20_411b_9629_61db293c90c0.slice/crio-f02e14c8830bc4b3d56c180d62d9558be1e1e07ef732a771f4f2bce968806615 WatchSource:0}: Error finding container f02e14c8830bc4b3d56c180d62d9558be1e1e07ef732a771f4f2bce968806615: Status 404 returned error can't find the container with id f02e14c8830bc4b3d56c180d62d9558be1e1e07ef732a771f4f2bce968806615 Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.805088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxxhq" event={"ID":"adbb3595-fa20-411b-9629-61db293c90c0","Type":"ContainerStarted","Data":"f02e14c8830bc4b3d56c180d62d9558be1e1e07ef732a771f4f2bce968806615"} Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.807233 4764 generic.go:334] "Generic (PLEG): container finished" podID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerID="e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e" exitCode=0 Oct 01 17:23:00 crc kubenswrapper[4764]: I1001 17:23:00.807268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerDied","Data":"e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e"} Oct 01 17:23:01 crc kubenswrapper[4764]: I1001 17:23:01.820437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" 
event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerStarted","Data":"8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b"} Oct 01 17:23:01 crc kubenswrapper[4764]: I1001 17:23:01.822175 4764 generic.go:334] "Generic (PLEG): container finished" podID="adbb3595-fa20-411b-9629-61db293c90c0" containerID="dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe" exitCode=0 Oct 01 17:23:01 crc kubenswrapper[4764]: I1001 17:23:01.822223 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxxhq" event={"ID":"adbb3595-fa20-411b-9629-61db293c90c0","Type":"ContainerDied","Data":"dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe"} Oct 01 17:23:01 crc kubenswrapper[4764]: I1001 17:23:01.847706 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-966pd" podStartSLOduration=2.240183275 podStartE2EDuration="5.847687337s" podCreationTimestamp="2025-10-01 17:22:56 +0000 UTC" firstStartedPulling="2025-10-01 17:22:57.774491049 +0000 UTC m=+4840.774137884" lastFinishedPulling="2025-10-01 17:23:01.381995111 +0000 UTC m=+4844.381641946" observedRunningTime="2025-10-01 17:23:01.838003558 +0000 UTC m=+4844.837650423" watchObservedRunningTime="2025-10-01 17:23:01.847687337 +0000 UTC m=+4844.847334192" Oct 01 17:23:03 crc kubenswrapper[4764]: I1001 17:23:03.846167 4764 generic.go:334] "Generic (PLEG): container finished" podID="adbb3595-fa20-411b-9629-61db293c90c0" containerID="6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc" exitCode=0 Oct 01 17:23:03 crc kubenswrapper[4764]: I1001 17:23:03.846243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxxhq" event={"ID":"adbb3595-fa20-411b-9629-61db293c90c0","Type":"ContainerDied","Data":"6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc"} Oct 01 17:23:04 crc kubenswrapper[4764]: I1001 
17:23:04.866545 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxxhq" event={"ID":"adbb3595-fa20-411b-9629-61db293c90c0","Type":"ContainerStarted","Data":"dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262"} Oct 01 17:23:04 crc kubenswrapper[4764]: I1001 17:23:04.890018 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pxxhq" podStartSLOduration=3.327168273 podStartE2EDuration="5.890001033s" podCreationTimestamp="2025-10-01 17:22:59 +0000 UTC" firstStartedPulling="2025-10-01 17:23:01.824055465 +0000 UTC m=+4844.823702300" lastFinishedPulling="2025-10-01 17:23:04.386888225 +0000 UTC m=+4847.386535060" observedRunningTime="2025-10-01 17:23:04.885067101 +0000 UTC m=+4847.884713936" watchObservedRunningTime="2025-10-01 17:23:04.890001033 +0000 UTC m=+4847.889647868" Oct 01 17:23:06 crc kubenswrapper[4764]: I1001 17:23:06.492584 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:23:06 crc kubenswrapper[4764]: I1001 17:23:06.492954 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:23:06 crc kubenswrapper[4764]: I1001 17:23:06.558016 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:23:06 crc kubenswrapper[4764]: I1001 17:23:06.932433 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:23:09 crc kubenswrapper[4764]: I1001 17:23:09.915871 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-966pd"] Oct 01 17:23:09 crc kubenswrapper[4764]: I1001 17:23:09.916398 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-966pd" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="registry-server" containerID="cri-o://8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b" gracePeriod=2 Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.248345 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.248611 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.366704 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.505483 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.648091 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-utilities\") pod \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.648221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-kube-api-access-4zw22\") pod \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.648383 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-catalog-content\") pod 
\"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\" (UID: \"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79\") " Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.649114 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-utilities" (OuterVolumeSpecName: "utilities") pod "40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" (UID: "40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.721581 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.721648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" (UID: "40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:23:10 crc kubenswrapper[4764]: E1001 17:23:10.722909 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zf6qx_openshift-machine-config-operator(2068a381-c49b-41a4-bd0d-8c525f9b30d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.751373 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.751429 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.919521 4764 generic.go:334] "Generic (PLEG): container finished" podID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerID="8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b" exitCode=0 Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.919578 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-966pd" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.919578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerDied","Data":"8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b"} Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.919622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-966pd" event={"ID":"40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79","Type":"ContainerDied","Data":"ad0725333ad5426ffe602477d1f20c1bcceabda7133d1be480fa03389200ede9"} Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.919642 4764 scope.go:117] "RemoveContainer" containerID="8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b" Oct 01 17:23:10 crc kubenswrapper[4764]: I1001 17:23:10.936403 4764 scope.go:117] "RemoveContainer" containerID="e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.101364 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-kube-api-access-4zw22" (OuterVolumeSpecName: "kube-api-access-4zw22") pod "40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" (UID: "40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79"). InnerVolumeSpecName "kube-api-access-4zw22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.120034 4764 scope.go:117] "RemoveContainer" containerID="f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.160438 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79-kube-api-access-4zw22\") on node \"crc\" DevicePath \"\"" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.243410 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.300766 4764 scope.go:117] "RemoveContainer" containerID="8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b" Oct 01 17:23:11 crc kubenswrapper[4764]: E1001 17:23:11.301312 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b\": container with ID starting with 8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b not found: ID does not exist" containerID="8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.301361 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b"} err="failed to get container status \"8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b\": rpc error: code = NotFound desc = could not find container \"8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b\": container with ID starting with 8c23d6445f4362f5caada660b77fae2cbf9636f611b745a272b90c9d937f985b not found: ID does not exist" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 
17:23:11.301392 4764 scope.go:117] "RemoveContainer" containerID="e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e" Oct 01 17:23:11 crc kubenswrapper[4764]: E1001 17:23:11.308262 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e\": container with ID starting with e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e not found: ID does not exist" containerID="e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.308318 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e"} err="failed to get container status \"e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e\": rpc error: code = NotFound desc = could not find container \"e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e\": container with ID starting with e2003f41fdbb4bfa335e6e7d39aed5ebc46a6d35c8598f80db43f00f4af0471e not found: ID does not exist" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.308345 4764 scope.go:117] "RemoveContainer" containerID="f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574" Oct 01 17:23:11 crc kubenswrapper[4764]: E1001 17:23:11.308857 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574\": container with ID starting with f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574 not found: ID does not exist" containerID="f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.308910 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574"} err="failed to get container status \"f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574\": rpc error: code = NotFound desc = could not find container \"f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574\": container with ID starting with f56a20d333cded1d10a4db64b8dac47e8bcb3bb4e55e33ae18ff0fdd5ec23574 not found: ID does not exist" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.377232 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-966pd"] Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.392785 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-966pd"] Oct 01 17:23:11 crc kubenswrapper[4764]: E1001 17:23:11.472477 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40eb4aa1_9ed6_402b_8bf3_3ff7da0e7a79.slice\": RecentStats: unable to find data in memory cache]" Oct 01 17:23:11 crc kubenswrapper[4764]: I1001 17:23:11.732521 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" path="/var/lib/kubelet/pods/40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79/volumes" Oct 01 17:23:12 crc kubenswrapper[4764]: I1001 17:23:12.722984 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxxhq"] Oct 01 17:23:12 crc kubenswrapper[4764]: I1001 17:23:12.938561 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pxxhq" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="registry-server" containerID="cri-o://dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262" gracePeriod=2 Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 
17:23:13.463329 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.623229 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwzhp\" (UniqueName: \"kubernetes.io/projected/adbb3595-fa20-411b-9629-61db293c90c0-kube-api-access-lwzhp\") pod \"adbb3595-fa20-411b-9629-61db293c90c0\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.623561 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-catalog-content\") pod \"adbb3595-fa20-411b-9629-61db293c90c0\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.623619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-utilities\") pod \"adbb3595-fa20-411b-9629-61db293c90c0\" (UID: \"adbb3595-fa20-411b-9629-61db293c90c0\") " Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.630708 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-utilities" (OuterVolumeSpecName: "utilities") pod "adbb3595-fa20-411b-9629-61db293c90c0" (UID: "adbb3595-fa20-411b-9629-61db293c90c0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.632238 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbb3595-fa20-411b-9629-61db293c90c0-kube-api-access-lwzhp" (OuterVolumeSpecName: "kube-api-access-lwzhp") pod "adbb3595-fa20-411b-9629-61db293c90c0" (UID: "adbb3595-fa20-411b-9629-61db293c90c0"). InnerVolumeSpecName "kube-api-access-lwzhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.640988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adbb3595-fa20-411b-9629-61db293c90c0" (UID: "adbb3595-fa20-411b-9629-61db293c90c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.726072 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.726122 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwzhp\" (UniqueName: \"kubernetes.io/projected/adbb3595-fa20-411b-9629-61db293c90c0-kube-api-access-lwzhp\") on node \"crc\" DevicePath \"\"" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.726133 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbb3595-fa20-411b-9629-61db293c90c0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.949067 4764 generic.go:334] "Generic (PLEG): container finished" podID="adbb3595-fa20-411b-9629-61db293c90c0" 
containerID="dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262" exitCode=0 Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.949119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxxhq" event={"ID":"adbb3595-fa20-411b-9629-61db293c90c0","Type":"ContainerDied","Data":"dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262"} Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.949156 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxxhq" event={"ID":"adbb3595-fa20-411b-9629-61db293c90c0","Type":"ContainerDied","Data":"f02e14c8830bc4b3d56c180d62d9558be1e1e07ef732a771f4f2bce968806615"} Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.949123 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxxhq" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.949180 4764 scope.go:117] "RemoveContainer" containerID="dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262" Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.974379 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxxhq"] Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.986148 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxxhq"] Oct 01 17:23:13 crc kubenswrapper[4764]: I1001 17:23:13.994186 4764 scope.go:117] "RemoveContainer" containerID="6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.026750 4764 scope.go:117] "RemoveContainer" containerID="dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.078244 4764 scope.go:117] "RemoveContainer" containerID="dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262" Oct 01 
17:23:14 crc kubenswrapper[4764]: E1001 17:23:14.079316 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262\": container with ID starting with dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262 not found: ID does not exist" containerID="dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.079354 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262"} err="failed to get container status \"dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262\": rpc error: code = NotFound desc = could not find container \"dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262\": container with ID starting with dd9ba72dde8712c2572b9efe57f2bd70d05dd43d6b3a258efe5009cac2ad1262 not found: ID does not exist" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.079382 4764 scope.go:117] "RemoveContainer" containerID="6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc" Oct 01 17:23:14 crc kubenswrapper[4764]: E1001 17:23:14.079792 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc\": container with ID starting with 6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc not found: ID does not exist" containerID="6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.079824 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc"} err="failed to get container status 
\"6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc\": rpc error: code = NotFound desc = could not find container \"6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc\": container with ID starting with 6c26041d26174b82fa944fc32729c350ba88d0afa093399a371c83c275c1e0bc not found: ID does not exist" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.079841 4764 scope.go:117] "RemoveContainer" containerID="dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe" Oct 01 17:23:14 crc kubenswrapper[4764]: E1001 17:23:14.080175 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe\": container with ID starting with dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe not found: ID does not exist" containerID="dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe" Oct 01 17:23:14 crc kubenswrapper[4764]: I1001 17:23:14.080196 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe"} err="failed to get container status \"dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe\": rpc error: code = NotFound desc = could not find container \"dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe\": container with ID starting with dc3fb377fe00c6a55e197819809aff49849597e0f78124cbec2ba6828bb2bfbe not found: ID does not exist" Oct 01 17:23:15 crc kubenswrapper[4764]: I1001 17:23:15.735667 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbb3595-fa20-411b-9629-61db293c90c0" path="/var/lib/kubelet/pods/adbb3595-fa20-411b-9629-61db293c90c0/volumes" Oct 01 17:23:25 crc kubenswrapper[4764]: I1001 17:23:25.721518 4764 scope.go:117] "RemoveContainer" containerID="99952d6939b2e5a2a2173d26169494ea16681db748a94b2cac3a17a486476f22" Oct 01 
17:23:26 crc kubenswrapper[4764]: I1001 17:23:26.053523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" event={"ID":"2068a381-c49b-41a4-bd0d-8c525f9b30d0","Type":"ContainerStarted","Data":"1074a573be6cf662d7fd86c538f233de6e0ff601948c3badd9324a22f9b331e1"} Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.239357 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v4scx"] Oct 01 17:24:11 crc kubenswrapper[4764]: E1001 17:24:11.240459 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="extract-content" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240474 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="extract-content" Oct 01 17:24:11 crc kubenswrapper[4764]: E1001 17:24:11.240505 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="extract-utilities" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240511 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="extract-utilities" Oct 01 17:24:11 crc kubenswrapper[4764]: E1001 17:24:11.240536 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="extract-content" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240544 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="extract-content" Oct 01 17:24:11 crc kubenswrapper[4764]: E1001 17:24:11.240558 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="registry-server" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240563 4764 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="registry-server" Oct 01 17:24:11 crc kubenswrapper[4764]: E1001 17:24:11.240574 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="extract-utilities" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240580 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="extract-utilities" Oct 01 17:24:11 crc kubenswrapper[4764]: E1001 17:24:11.240587 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="registry-server" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240593 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="registry-server" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240759 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbb3595-fa20-411b-9629-61db293c90c0" containerName="registry-server" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.240779 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40eb4aa1-9ed6-402b-8bf3-3ff7da0e7a79" containerName="registry-server" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.242127 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.268396 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4scx"] Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.368566 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-catalog-content\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.368655 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s67k2\" (UniqueName: \"kubernetes.io/projected/67f9bb48-d921-4735-8705-0db6606550dd-kube-api-access-s67k2\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.368714 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-utilities\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.470718 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-catalog-content\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.470834 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s67k2\" (UniqueName: \"kubernetes.io/projected/67f9bb48-d921-4735-8705-0db6606550dd-kube-api-access-s67k2\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.470884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-utilities\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.471338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-catalog-content\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.471417 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-utilities\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.491175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s67k2\" (UniqueName: \"kubernetes.io/projected/67f9bb48-d921-4735-8705-0db6606550dd-kube-api-access-s67k2\") pod \"certified-operators-v4scx\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") " pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:11 crc kubenswrapper[4764]: I1001 17:24:11.567943 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4scx" Oct 01 17:24:12 crc kubenswrapper[4764]: I1001 17:24:12.086487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4scx"] Oct 01 17:24:12 crc kubenswrapper[4764]: I1001 17:24:12.485279 4764 generic.go:334] "Generic (PLEG): container finished" podID="67f9bb48-d921-4735-8705-0db6606550dd" containerID="d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b" exitCode=0 Oct 01 17:24:12 crc kubenswrapper[4764]: I1001 17:24:12.485330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerDied","Data":"d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b"} Oct 01 17:24:12 crc kubenswrapper[4764]: I1001 17:24:12.485365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerStarted","Data":"c83b9d12fa4d14706d230a56ff0cc48d05eff8d0a47eb199b7bc64725b53a1f8"} Oct 01 17:24:12 crc kubenswrapper[4764]: I1001 17:24:12.487715 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 17:24:13 crc kubenswrapper[4764]: I1001 17:24:13.500742 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerStarted","Data":"e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b"} Oct 01 17:24:14 crc kubenswrapper[4764]: I1001 17:24:14.510696 4764 generic.go:334] "Generic (PLEG): container finished" podID="67f9bb48-d921-4735-8705-0db6606550dd" containerID="e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b" exitCode=0 Oct 01 17:24:14 crc kubenswrapper[4764]: I1001 17:24:14.510921 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerDied","Data":"e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b"}
Oct 01 17:24:15 crc kubenswrapper[4764]: I1001 17:24:15.528316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerStarted","Data":"51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17"}
Oct 01 17:24:15 crc kubenswrapper[4764]: I1001 17:24:15.554218 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v4scx" podStartSLOduration=2.078611124 podStartE2EDuration="4.554197905s" podCreationTimestamp="2025-10-01 17:24:11 +0000 UTC" firstStartedPulling="2025-10-01 17:24:12.487474988 +0000 UTC m=+4915.487121823" lastFinishedPulling="2025-10-01 17:24:14.963061769 +0000 UTC m=+4917.962708604" observedRunningTime="2025-10-01 17:24:15.549267213 +0000 UTC m=+4918.548914058" watchObservedRunningTime="2025-10-01 17:24:15.554197905 +0000 UTC m=+4918.553844750"
Oct 01 17:24:21 crc kubenswrapper[4764]: I1001 17:24:21.568911 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v4scx"
Oct 01 17:24:21 crc kubenswrapper[4764]: I1001 17:24:21.569709 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v4scx"
Oct 01 17:24:21 crc kubenswrapper[4764]: I1001 17:24:21.616946 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v4scx"
Oct 01 17:24:22 crc kubenswrapper[4764]: I1001 17:24:22.912015 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v4scx"
Oct 01 17:24:22 crc kubenswrapper[4764]: I1001 17:24:22.955413 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4scx"]
Oct 01 17:24:24 crc kubenswrapper[4764]: I1001 17:24:24.600132 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v4scx" podUID="67f9bb48-d921-4735-8705-0db6606550dd" containerName="registry-server" containerID="cri-o://51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17" gracePeriod=2
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.551472 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4scx"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.612688 4764 generic.go:334] "Generic (PLEG): container finished" podID="67f9bb48-d921-4735-8705-0db6606550dd" containerID="51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17" exitCode=0
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.612744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerDied","Data":"51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17"}
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.612767 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4scx"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.612787 4764 scope.go:117] "RemoveContainer" containerID="51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.612774 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4scx" event={"ID":"67f9bb48-d921-4735-8705-0db6606550dd","Type":"ContainerDied","Data":"c83b9d12fa4d14706d230a56ff0cc48d05eff8d0a47eb199b7bc64725b53a1f8"}
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.646697 4764 scope.go:117] "RemoveContainer" containerID="e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.666698 4764 scope.go:117] "RemoveContainer" containerID="d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.698487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-catalog-content\") pod \"67f9bb48-d921-4735-8705-0db6606550dd\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") "
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.698709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-utilities\") pod \"67f9bb48-d921-4735-8705-0db6606550dd\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") "
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.698859 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s67k2\" (UniqueName: \"kubernetes.io/projected/67f9bb48-d921-4735-8705-0db6606550dd-kube-api-access-s67k2\") pod \"67f9bb48-d921-4735-8705-0db6606550dd\" (UID: \"67f9bb48-d921-4735-8705-0db6606550dd\") "
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.700740 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-utilities" (OuterVolumeSpecName: "utilities") pod "67f9bb48-d921-4735-8705-0db6606550dd" (UID: "67f9bb48-d921-4735-8705-0db6606550dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.706725 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f9bb48-d921-4735-8705-0db6606550dd-kube-api-access-s67k2" (OuterVolumeSpecName: "kube-api-access-s67k2") pod "67f9bb48-d921-4735-8705-0db6606550dd" (UID: "67f9bb48-d921-4735-8705-0db6606550dd"). InnerVolumeSpecName "kube-api-access-s67k2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.713318 4764 scope.go:117] "RemoveContainer" containerID="51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17"
Oct 01 17:24:25 crc kubenswrapper[4764]: E1001 17:24:25.714037 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17\": container with ID starting with 51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17 not found: ID does not exist" containerID="51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.714107 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17"} err="failed to get container status \"51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17\": rpc error: code = NotFound desc = could not find container \"51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17\": container with ID starting with 51b8a9f6b8333cf4ae45b7f7a4dab9de29ee5fbb75848a3c44a1d84204e21f17 not found: ID does not exist"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.714134 4764 scope.go:117] "RemoveContainer" containerID="e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b"
Oct 01 17:24:25 crc kubenswrapper[4764]: E1001 17:24:25.714554 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b\": container with ID starting with e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b not found: ID does not exist" containerID="e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.714585 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b"} err="failed to get container status \"e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b\": rpc error: code = NotFound desc = could not find container \"e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b\": container with ID starting with e65c07ddb4eafcbaa32087ab77cdd7d817f0e42e76e5714b4239ecbd06216e0b not found: ID does not exist"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.714603 4764 scope.go:117] "RemoveContainer" containerID="d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b"
Oct 01 17:24:25 crc kubenswrapper[4764]: E1001 17:24:25.715008 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b\": container with ID starting with d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b not found: ID does not exist" containerID="d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.715099 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b"} err="failed to get container status \"d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b\": rpc error: code = NotFound desc = could not find container \"d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b\": container with ID starting with d80405a5d7af6ce2fe824d1f423eb632afde124fbc1e7f35148cac29e043318b not found: ID does not exist"
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.756329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67f9bb48-d921-4735-8705-0db6606550dd" (UID: "67f9bb48-d921-4735-8705-0db6606550dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.801204 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s67k2\" (UniqueName: \"kubernetes.io/projected/67f9bb48-d921-4735-8705-0db6606550dd-kube-api-access-s67k2\") on node \"crc\" DevicePath \"\""
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.801265 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.801277 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f9bb48-d921-4735-8705-0db6606550dd-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.947669 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4scx"]
Oct 01 17:24:25 crc kubenswrapper[4764]: I1001 17:24:25.957365 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v4scx"]
Oct 01 17:24:27 crc kubenswrapper[4764]: I1001 17:24:27.733941 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f9bb48-d921-4735-8705-0db6606550dd" path="/var/lib/kubelet/pods/67f9bb48-d921-4735-8705-0db6606550dd/volumes"
Oct 01 17:25:12 crc kubenswrapper[4764]: I1001 17:25:12.067551 4764 generic.go:334] "Generic (PLEG): container finished" podID="d047901f-1b7a-4204-8764-67631f06b45d" containerID="df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639" exitCode=0
Oct 01 17:25:12 crc kubenswrapper[4764]: I1001 17:25:12.067628 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v59zf/must-gather-5vzpr" event={"ID":"d047901f-1b7a-4204-8764-67631f06b45d","Type":"ContainerDied","Data":"df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"}
Oct 01 17:25:12 crc kubenswrapper[4764]: I1001 17:25:12.069333 4764 scope.go:117] "RemoveContainer" containerID="df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"
Oct 01 17:25:12 crc kubenswrapper[4764]: I1001 17:25:12.234978 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v59zf_must-gather-5vzpr_d047901f-1b7a-4204-8764-67631f06b45d/gather/0.log"
Oct 01 17:25:24 crc kubenswrapper[4764]: I1001 17:25:24.670925 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v59zf/must-gather-5vzpr"]
Oct 01 17:25:24 crc kubenswrapper[4764]: I1001 17:25:24.671805 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v59zf/must-gather-5vzpr" podUID="d047901f-1b7a-4204-8764-67631f06b45d" containerName="copy" containerID="cri-o://37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8" gracePeriod=2
Oct 01 17:25:24 crc kubenswrapper[4764]: I1001 17:25:24.679246 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v59zf/must-gather-5vzpr"]
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.208799 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v59zf_must-gather-5vzpr_d047901f-1b7a-4204-8764-67631f06b45d/copy/0.log"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.209740 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/must-gather-5vzpr"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.210632 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v59zf_must-gather-5vzpr_d047901f-1b7a-4204-8764-67631f06b45d/copy/0.log"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.210889 4764 generic.go:334] "Generic (PLEG): container finished" podID="d047901f-1b7a-4204-8764-67631f06b45d" containerID="37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8" exitCode=143
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.210934 4764 scope.go:117] "RemoveContainer" containerID="37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.246455 4764 scope.go:117] "RemoveContainer" containerID="df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.304428 4764 scope.go:117] "RemoveContainer" containerID="37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8"
Oct 01 17:25:25 crc kubenswrapper[4764]: E1001 17:25:25.304959 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8\": container with ID starting with 37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8 not found: ID does not exist" containerID="37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.305019 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8"} err="failed to get container status \"37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8\": rpc error: code = NotFound desc = could not find container \"37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8\": container with ID starting with 37ab9be63c2b7e4d5aff0ae89af05f455a33dfbc033c24b15abeccb9da78afe8 not found: ID does not exist"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.305066 4764 scope.go:117] "RemoveContainer" containerID="df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"
Oct 01 17:25:25 crc kubenswrapper[4764]: E1001 17:25:25.305704 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639\": container with ID starting with df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639 not found: ID does not exist" containerID="df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.305734 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639"} err="failed to get container status \"df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639\": rpc error: code = NotFound desc = could not find container \"df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639\": container with ID starting with df5f58be06844d97f4a108f354136799efb1dfb2878f632725ecccb07a5b4639 not found: ID does not exist"
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.407024 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d047901f-1b7a-4204-8764-67631f06b45d-must-gather-output\") pod \"d047901f-1b7a-4204-8764-67631f06b45d\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") "
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.407122 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b67hp\" (UniqueName: \"kubernetes.io/projected/d047901f-1b7a-4204-8764-67631f06b45d-kube-api-access-b67hp\") pod \"d047901f-1b7a-4204-8764-67631f06b45d\" (UID: \"d047901f-1b7a-4204-8764-67631f06b45d\") "
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.416388 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d047901f-1b7a-4204-8764-67631f06b45d-kube-api-access-b67hp" (OuterVolumeSpecName: "kube-api-access-b67hp") pod "d047901f-1b7a-4204-8764-67631f06b45d" (UID: "d047901f-1b7a-4204-8764-67631f06b45d"). InnerVolumeSpecName "kube-api-access-b67hp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.509365 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b67hp\" (UniqueName: \"kubernetes.io/projected/d047901f-1b7a-4204-8764-67631f06b45d-kube-api-access-b67hp\") on node \"crc\" DevicePath \"\""
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.587955 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d047901f-1b7a-4204-8764-67631f06b45d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d047901f-1b7a-4204-8764-67631f06b45d" (UID: "d047901f-1b7a-4204-8764-67631f06b45d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.611493 4764 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d047901f-1b7a-4204-8764-67631f06b45d-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 01 17:25:25 crc kubenswrapper[4764]: I1001 17:25:25.741582 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d047901f-1b7a-4204-8764-67631f06b45d" path="/var/lib/kubelet/pods/d047901f-1b7a-4204-8764-67631f06b45d/volumes"
Oct 01 17:25:26 crc kubenswrapper[4764]: I1001 17:25:26.223279 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v59zf/must-gather-5vzpr"
Oct 01 17:25:51 crc kubenswrapper[4764]: I1001 17:25:51.914289 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 17:25:51 crc kubenswrapper[4764]: I1001 17:25:51.915091 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 17:26:21 crc kubenswrapper[4764]: I1001 17:26:21.913719 4764 patch_prober.go:28] interesting pod/machine-config-daemon-zf6qx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 17:26:21 crc kubenswrapper[4764]: I1001 17:26:21.914275 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zf6qx" podUID="2068a381-c49b-41a4-bd0d-8c525f9b30d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"